cheat-fork-echo/cheat/cheatsheets/wget

# To download a single file
wget http://path.to.the/file
# To download a file and change its name
wget http://path.to.the/file -O newname
# To download a file into a directory
wget -P path/to/directory http://path.to.the/file
# To continue an aborted download
wget -c http://path.to.the/file
# To download multiple files from multiple URLs
wget URL1 URL2
# To download every URL listed in a file, one per line
wget -i url_list.txt
# To mirror a whole page locally
wget -pk http://path.to.the/page.html
# To mirror a whole site locally
wget -mk http://site.tl/
# To download a numbered series of files (the pattern is expanded by the shell)
wget http://www.myserver.com/files-{1..15}.tar.bz2
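Note that the `{1..15}` range is expanded by the shell (bash or zsh), not by wget; a quick way to preview the argument list wget would receive is to pass the same pattern to echo (range shortened here for illustration):

```shell
# Brace expansion happens in the shell before wget runs;
# echo prints the URLs wget would be given as arguments
echo http://www.myserver.com/files-{1..3}.tar.bz2
# → http://www.myserver.com/files-1.tar.bz2 http://www.myserver.com/files-2.tar.bz2 http://www.myserver.com/files-3.tar.bz2
```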
# To download all the files in a directory with a specific extension if directory indexing is enabled
wget -r -l1 -A.extension http://myserver.com/directory
# To print only the response headers without downloading the page (-S --spider)
wget -S --spider -O - http://google.com
# To change the User-Agent header to 'toto'
wget -U 'toto' http://google.com