# wget

URL(s): the address(es) of the file(s) to download.


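These fields correspond to a plain wget invocation. The synopsis below follows the wget manual; the example URL is a placeholder:

```sh
# General form: zero or more options followed by one or more URLs.
wget [OPTION]... [URL]...

# Download a single file into the current directory.
wget https://example.com/file.zip
```
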
# Download

| Command | Description |
| --- | --- |
| Attempts | Set the number of attempts to connect to the URL(s). Specify 0 for infinite retrying. Default: 20. |
| No Clobber | If deactivated, downloading the same file in the same directory results in the original copy of the file being preserved and the second copy being named ‘file.1’. |
| Unlink | Force Wget to unlink the file instead of clobbering the existing file. |
| Resume | Continue downloading a single unfinished file. |
| Progression Indicator | The type of progress indicator to display. Default: bar. |
| Force Show Progression Indicator | Force Wget to display the progress bar at any verbosity. |
| Time-Stamping | For each file, Wget checks whether a local file of the same name exists. If it does, and the remote file is not newer, Wget will not download it. If the local file does not exist, or the sizes of the files do not match, Wget downloads the remote file no matter what the time-stamps say. |
| Limit Rate | Limit the download speed. Value in bytes (default), kilobytes with the ‘k’ suffix, or megabytes with the ‘m’ suffix. |
| Wait | Wait the specified amount of time between each download. Seconds (default), minutes with ‘m’, hours with ‘h’, days with ‘d’. |
| Retry Wait | Wait the specified number of seconds between retries of failed downloads. |
| Random Wait | Causes the time between requests to vary between 0.5 and 1.5 times the wait value specified with the ‘--wait’ option. |
| Quota | Only works when downloading more than one file. Specify the download quota for automatic retrievals. Value in bytes (default), kilobytes (‘k’ suffix), or megabytes (‘m’ suffix). |
| Spider | Wget behaves as a web spider: it does not download the pages, just checks that they are there. Not as functional as a real web spider. |
| Username | Username for both FTP and HTTP file retrieval. |
| Password | Password for both FTP and HTTP file retrieval. |
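
These options map onto wget's command-line flags. A minimal sketch (flag names are taken from the wget manual, not from this app; the URL and file names are placeholders):

```sh
# Try up to 10 times, resume an unfinished download, limit speed to 500 KB/s,
# and wait a randomized 0.5x-1.5x of 5 seconds between requests.
wget --tries=10 --continue --limit-rate=500k --wait=5 --random-wait \
     https://example.com/big.iso

# Only check that the page exists, without downloading it.
wget --spider https://example.com/page.html
```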

# Directory

| Command | Description |
| --- | --- |
| No Directories | Do not create a hierarchy of directories when retrieving recursively. With this option turned on, all files are saved to the current directory, without clobbering. |
| Force Directories | Create a hierarchy of directories, even if one would not have been created otherwise. |
| No Host Directories | Disable generation of host-prefixed directories. |
| Save File(s) Location | Where to save the files. If the directory(ies) do not exist, they will be created. Default: ./. |
| Cut Directory(ies) | Ignore the given number of directory components. Example: ftp.xemacs.org/pub/xemacs/ with ‘--cut-dirs=1’ is saved locally as ftp.xemacs.org/xemacs/. |
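
A sketch of how the directory options combine on the command line (flags follow the wget manual; the local path is a placeholder):

```sh
# Recursive retrieval saved under ./downloads, dropping the host directory
# and the first remote path component (pub/) from the local layout.
wget -r --no-host-directories --cut-dirs=1 --directory-prefix=downloads \
     ftp://ftp.xemacs.org/pub/xemacs/
```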

# File & Debug

| Command | Description |
| --- | --- |
| Append Output Logs | Append all messages to a logfile. If the logfile does not exist, a new file is created. |
| Debug | Turn on debug output, which contains various information important to the developers of Wget when it does not work properly. |
| Quiet | Turn off Wget’s output. |
| Report Speed | Output bandwidth in bits. |
| Input File | Read URLs from a local or external file. If this function is used, no URL(s) need be present on the command line. |
| Config File | Specify the location of a startup file to use instead of the default one(s). |
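
As a rough example of the corresponding flags (file names are placeholders; flag names follow the wget manual):

```sh
# Read URLs from urls.txt, run quietly, and append all messages to wget.log.
wget --input-file=urls.txt --quiet --append-output=wget.log

# Use an alternate startup file and turn on debug output.
wget --config=./custom-wgetrc --debug https://example.com/
```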

# HTTP(S)

| Command | Description |
| --- | --- |
| No Cache | Disable server-side caching. |
| Compress File | Ask the server to compress the file using the gzip compression format. gzip is the only available format. |
| Adjust Extension | If a file of type ‘text/html’ is downloaded and the URL does not end with the regexp ‘\.[Hh][Tt][Mm][Ll]?’, the suffix ‘.html’ is appended to the local filename. |
| Ignore Content-Length | Ignore the Content-Length header. |
| Max Redirects | Maximum number of redirects to follow for a resource. Default: 20. |
| Change User-Agent | Change the client identity reported to the HTTP server. By default, Wget identifies itself as Wget/current_version. |
| Retry on Host Error | Consider host errors as non-fatal, and retry. |
| No Checking Certificate | Do not check the server’s certificate against the available certificate authorities; continue downloading even if verification fails. |
| HTTPS Only | When in recursive mode, only follow HTTPS links. |
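
A sketch of the equivalent flags, assuming a reasonably recent wget build (the ‘--compression’ and ‘--retry-on-host-error’ options are not present in older versions; URL and agent string are placeholders):

```sh
# Request gzip compression, save HTML with a .html extension, follow at most
# 5 redirects, and present a custom User-Agent string.
wget --compression=gzip --adjust-extension --max-redirect=5 \
     --user-agent="MyAgent/1.0" https://example.com/page

# Skip certificate verification (use with care) and, during recursion,
# follow only HTTPS links.
wget --no-check-certificate --https-only -r https://example.com/
```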

# FTP(S)

| Command | Description |
| --- | --- |
| Don't Remove Listings | Don’t remove the temporary listing files generated by FTP retrievals. Keeping them can be useful for debugging, or for easily checking the contents of remote server directories. |
| FTP Fallback | Fall back to FTP if FTPS is not supported by the target server. |
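
For reference, the corresponding flags look roughly like this (FTPS support depends on how wget was built; the URL is a placeholder):

```sh
# Keep the temporary .listing files for inspection, and fall back to plain
# FTP if the server does not support FTPS.
wget --no-remove-listing --ftps-fallback-to-ftp ftps://example.com/pub/
```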

# Recursive

| Command | Description |
| --- | --- |
| Recursion | Turn on recursive retrieving. |
| Depth Level | Specify how many levels deep to go during recursion. Default: 5. |
| Convert Links | Convert all links in a document to make them suitable for local viewing. Links that have not been downloaded point to their Internet URL instead, e.g. https://cookie.com/img.gif. |
| Mirror | Turn on all options suitable for mirroring. This option turns on recursion and time-stamping, sets infinite recursion depth, and keeps FTP directory listings. |
| Page Requisites | Download all files necessary to properly display a given HTML page, including inlined images, sounds, and referenced stylesheets. |
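
A sketch of the recursive options as wget flags (URLs are placeholders; flag names follow the wget manual):

```sh
# Fetch a site two levels deep, pull in page requisites (images, CSS),
# and rewrite links for offline viewing.
wget --recursive --level=2 --page-requisites --convert-links https://example.com/

# Full mirror: recursion, time-stamping, infinite depth, kept FTP listings.
wget --mirror https://example.com/
```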

# Accept/Reject

| Command | Description |
| --- | --- |
| File(s) to Accept | A list of file extensions to download during recursive retrieval. If ‘*’, ‘?’, ‘[’ or ‘]’ is typed, the input is treated as a pattern instead. |
| File(s) to Reject | Download all files except those with the extensions specified in the list. If ‘*’, ‘?’, ‘[’ or ‘]’ is typed, the input is treated as a pattern instead. |
| No Parent | Never ascend to the parent directory when retrieving recursively. |
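
A sketch of accept/reject filtering during recursion (the extensions and URL are placeholders):

```sh
# Download only JPEG and PNG files and never climb above the starting directory.
wget -r --accept=jpg,jpeg,png --no-parent https://example.com/gallery/

# Or download everything except zip archives.
wget -r --reject=zip --no-parent https://example.com/gallery/
```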