# Zeiver

URLS — the URLs of the open directories (ODs) to scrape.

# General

| Command | Input | Description |
| --- | --- | --- |
| Scan | | Scan ODs, displaying their content in the terminal. Deactivates the Downloader & Recorder. (Example below.) |
| Verbose | | Enable verbose output. |
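
For instance, a scan-only run might look like the sketch below. The positional URLS come from the top of this page; the flag spellings here are illustrative placeholders rather than confirmed Zeiver flags, so check `zeiver --help` for the exact names.

```bash
# Scan two ODs and print their contents; the Downloader & Recorder
# stay off, so nothing is saved. (--scan/--verbose are illustrative.)
zeiver --scan --verbose \
  "https://example.com/files/" \
  "https://mirror.example.org/pub/"
```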

# Downloader

| Command | Input | Description |
| --- | --- | --- |
| Page Depth | | Specify the maximum depth for recursive scraping. Can also be used to traverse subpages (ODs with previous & next buttons). A depth of `1` is the current directory. Default: `20`. |
| Accept Files | | Using a Regex, specify which files to accept for scraping. This option takes precedence over the reject option. (Example below.) |
| Reject Files | | Using a Regex, specify which files to reject for scraping. The accept option takes precedence over this option. |
| Download Only | | Use the Downloader only. |
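
As a sketch of how the filter options above combine with depth (the flag names are illustrative, wget-style placeholders; only the Regex values and the precedence rule come from the table):

```bash
# Recurse up to 3 directory levels and download only .pdf/.epub files.
# The pattern is a Regex matched against the scraped file links.
zeiver --depth 3 --accept '\.(pdf|epub)$' "https://example.com/books/"

# Reject instead: skip anything whose name contains "sample".
# If a file matched both filters, the accept option would win.
zeiver --depth 3 --reject 'sample' "https://example.com/books/"
```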

# Recorder

| Command | Input | Description |
| --- | --- | --- |
| Activate Recorder | | Activates the Recorder, which saves the scraped links to a file. |
| Record Links Only | | Activates the Recorder. After scraping, instead of downloading the files, saves the links to them. (Example below.) |
| Rename Record File | | Changes the name of the record file, the file where the Recorder stores the links. Default: `URL_Records.txt`. |
| No Stat File(s) | | Prevents the Recorder from creating stat files. These files keep a record of the total number of files downloaded, along with other info. |
| No Listing of File(s) in Stats | | Prevents the Recorder from writing the list of file names to stat files. |
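
A record-only run could look like the following sketch; `--record-only` and `--output-record` are illustrative names for the "Record Links Only" and "Rename Record File" options above, not confirmed flags.

```bash
# Scrape the OD, but instead of downloading the files, write their
# links to books.txt (the default record file is URL_Records.txt).
zeiver --record-only --output-record "books.txt" "https://example.com/books/"
```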

# File/Directory

| Command | Input | Description |
| --- | --- | --- |
| Input File | | Read URLs from a local or external file. |
| Record Input File | | Read file links from a local or external file and create a stats file based on the results. Activates the Recorder. |
| Save Directory | | The local directory path where files are saved. Files saved by the Recorder are also stored here. Default: `./`. |
| Cut Directories | | Ignores a specified number of remote directories, preventing them from being created locally. Default: `0`. (Example below.) |
| No Directories | | Do not recreate the remote directory hierarchy locally. Only available when downloading. |
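
To make "Cut Directories" and "No Directories" concrete, here is a sketch of how one remote file might map to local paths. The flag names are illustrative, and the mapping assumes cut directories are removed from the left of the remote path, in the style of wget's `--cut-dirs`:

```bash
# Remote file: https://example.com/pub/media/movies/film.mkv

zeiver -o "./downloads" "https://example.com/pub/media/movies/"
#  -> ./downloads/pub/media/movies/film.mkv  (full remote hierarchy)

zeiver -o "./downloads" --cuts 2 "https://example.com/pub/media/movies/"
#  -> ./downloads/movies/film.mkv            (first 2 remote directories ignored)

zeiver -o "./downloads" --no-dirs "https://example.com/pub/media/movies/"
#  -> ./downloads/film.mkv                   (no remote directories created)
```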

# Grabber

| Command | Input | Description |
| --- | --- | --- |
| Print Headers | | Prints all response headers received from each request to the terminal. Takes precedence over all other options. |
| Print a Header | | Prints a specified response header from each request to the terminal. Takes precedence over all other options. |
| Print HTML | | Prints the HTML document of each URL to the terminal. |
| HTTPS Only | | Use HTTPS only. |
| Custom Header | [Multi] | Specify a custom HTTP header and its value, separated by a `$`. (Example below.) |
| User-Agent | | The User-Agent header to use. Default: `Zeiver/VERSION`. |
| Retries | | The number of times to retry a failed connection/request. Default: `20`. |
| Scrape Wait Delay | | Wait a specified number of seconds between each scraping request. |
| Download Wait Delay | | Wait a specified number of seconds between each downloading request. |
| Retry Delay | | The wait time between each failed request. Default: `10`. |
| Random Scrape Delay | | Wait a random number of seconds between each scraping request. The time between requests varies from 0.5 * Scrape Wait Delay (inclusive) to 1.5 * Scrape Wait Delay (exclusive). |
| Random Download Delay | | Wait a random number of seconds between each download request. The time between requests varies from 0.5 * Download Wait Delay (inclusive) to 1.5 * Download Wait Delay (exclusive). |
| Timeout | | Adds a request timeout of a specified number of seconds. `0` means infinite (no timeout). Default: `40`. |
| Redirects | | The maximum number of redirects to follow. Default: `10`. |
| Authentication | | Send a Basic Authentication request header. Must use the `username:password` format. |
| Proxy | | The proxy to use. |
| Proxy Auth | | The Basic Authentication credentials needed to use the proxy. Must use the `username:password` format. |
| Accept All Certificates | | Accepts all certificates, even invalid ones. Use at your own risk! |
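
Finally, a sketch combining several Grabber options. The `$`-separated header format and the `username:password` auth format come straight from the table above; every flag spelling is an illustrative placeholder.

```bash
# Two custom headers (name$value, single-quoted so the shell leaves
# the '$' alone; the option is [Multi], so it may repeat), Basic auth,
# a custom User-Agent, and a randomized scrape delay: with a wait of
# 4 seconds plus the random option, each pause lands in [2.0s, 6.0s).
zeiver \
  --headers 'Referer$https://example.com' \
  --headers 'X-Token$abc123' \
  --auth 'alice:hunter2' \
  --user-agent 'MyAgent/1.0' \
  --wait 4 --random-wait \
  "https://example.com/files/"
```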