```
zeiver URLS
```
# General
| Command | Description |
|---|---|
| Scan | Scans ODs, displaying their content in the terminal. Deactivates the Downloader & Recorder. |
| Verbose | Enables verbose output. |
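A minimal invocation combining the general options might look like the sketch below. Note that this page lists option names, not literal flags, so the `--scan` and `--verbose` spellings are assumptions:

```shell
# Scan the OD and print its contents to the terminal instead of
# downloading (flag spellings are assumed, not confirmed by this page).
zeiver --scan --verbose "https://example.com/files/"
```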
# Downloader
| Command | Description |
|---|---|
| Page Depth | Specifies the maximum depth for recursive scraping. Can also be used to traverse subpages (ODs with previous & next buttons). A depth of `1` is the current directory. Default: `20`. |
| Accept Files | Using a regex, specifies which files to accept for scraping. This option takes precedence over the reject option. |
| Reject Files | Using a regex, specifies which files to reject for scraping. The accept option takes precedence over this option. |
| Download Only | Uses the Downloader only. |
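A possible Downloader invocation, recursing two levels deep and accepting only PDFs; the `--depth`, `--accept`, and `--download-only` flag spellings are assumptions based on the option names above:

```shell
# Recurse 2 levels, keep only files matching the regex, and skip the
# Recorder entirely (flag spellings are assumed, not confirmed here).
zeiver --download-only --depth 2 --accept "\.pdf$" "https://example.com/files/"
```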
# Recorder
| Command | Description |
|---|---|
| Activate Recorder | Activates the Recorder, which saves the scraped links to a file. |
| Record Links Only | Activates the Recorder. After scraping, saves the links to the files instead of downloading them. |
| Rename Record File | Changes the name of the record file, where the Recorder stores the links. Default: `URL_Records.txt`. |
| No Stat File(s) | Prevents the Recorder from creating stat files, which record the total number of files downloaded along with other info. |
| No Listing of File(s) in Stats | Prevents the Recorder from writing a list of file names to the stat files. |
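A record-only run might be sketched as follows; the `--record-only`, `--output-record`, and `--no-stats` flag spellings are assumptions derived from the option names:

```shell
# Save scraped links to a custom record file instead of downloading,
# and skip stat files (flag spellings are assumed).
zeiver --record-only --output-record "links.txt" --no-stats "https://example.com/files/"
```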
# File/Directory
| Command | Description |
|---|---|
| Input File | Reads URLs from a local or external file. |
| Record Input File | Reads file links from a local or external file and creates a stats file based on the results. Activates the Recorder. |
| Save Directory | The local directory path where files are saved. Files saved by the Recorder are also stored here. Default: `./`. |
| Cut Directory(ies) | Prevents a specified number of remote directories from being created locally. Default: `0`. |
| No Directories | Does not create a hierarchy of remote directories. Only available when downloading. |
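Reading URLs from a file and controlling the local layout might look like this sketch; the `--input-file`, `--output`, and `--cuts` flag spellings are assumptions:

```shell
# Read URLs from urls.txt, save under ./downloads, and skip creating
# the first remote directory component (flag spellings are assumed).
zeiver --input-file urls.txt --output ./downloads --cuts 1
```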
# Grabber
| Command | Description |
|---|---|
| Print Headers | Prints all response headers received from each request to the terminal. This option takes precedence over all other options. |
| Print a Header | Prints a specified response header from each request to the terminal. This option takes precedence over all other options. |
| Print HTML | Prints the HTML document of each URL to the terminal. |
| HTTPS Only | Uses HTTPS only. |
| Custom Header [Multi] | Specifies a custom HTTP header and its value, separated by a `$`. |
| User-Agent | The User-Agent header to use. Default: `Zeiver/VERSION`. |
| Retries | The number of retries for a failed connection/request. Default: `20`. |
| Scrape Wait Delay | Waits a specified number of seconds between each scraping request. |
| Download Wait Delay | Waits a specified number of seconds between each download request. |
| Retry Delay | The wait time, in seconds, between each failed request. Default: `10`. |
| Random Scrape Delay | Waits a random number of seconds between each scraping request, varying from 0.5 × Scrape Wait Delay (inclusive) to 1.5 × Scrape Wait Delay (exclusive). |
| Random Download Delay | Waits a random number of seconds between each download request, varying from 0.5 × Download Wait Delay (inclusive) to 1.5 × Download Wait Delay (exclusive). |
| Timeout | Adds a request timeout of a specified number of seconds. `0` means infinite. Default: `40`. |
| Redirects | The maximum number of redirects to follow. Default: `10`. |
| Authentication | Sends a Basic Authentication request header. Must use the `username:password` format. |
| Proxy | The proxy to use. |
| Proxy Auth | The basic authentication credentials needed to use the proxy. Must use the `username:password` format. |
| Accept All Certificates | Accepts all certificates, even invalid ones. Use at your own risk! |
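Putting several Grabber options together, a polite-scraping run might be sketched as below. Every flag spelling here is an assumption inferred from the option names; only the `$` header separator and the `username:password` format are stated on this page. With a scrape wait of 4 seconds and the random delay enabled, each pause would fall in the range [2, 6) seconds:

```shell
# Hypothetical flags: HTTPS only, a custom header (name$value), a custom
# User-Agent, 4s randomized waits, 5 retries, and Basic Auth credentials.
zeiver --https-only \
  --headers "From\$admin@example.com" \
  --user-agent "MyAgent/1.0" \
  --wait 4 --random-wait \
  --tries 5 --timeout 30 \
  --auth "username:password" \
  "https://example.com/files/"
```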