Reference (from the wget(1) man page, on the -O option):
-O file
--output-document=file
    The documents will not be written to the appropriate files, but all will be concatenated together and
    written to file.  If - is used as file, documents will be printed to standard output, disabling link
    conversion.  (Use ./- to print to a file literally named -.)

    Use of -O is not intended to mean simply "use the name file instead of the one in the URL;" rather,
    it is analogous to shell redirection: wget -O file http://foo is intended to work like wget -O -
    http://foo > file; file will be truncated immediately, and all downloaded content will be written
    there.
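    As a concrete sketch of that equivalence (example.com is just a placeholder URL), both of the
    following save the response body to page.html:

        wget -O page.html http://example.com/
        wget -O - http://example.com/ > page.html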

    For this reason, -N (for timestamp-checking) is not supported in combination with -O: since file is
    always newly created, it will always have a very new timestamp. A warning will be issued if this
    combination is used.

    Similarly, using -r or -p with -O may not work as you expect: Wget won't just download the first file
    to file and then download the rest to their normal names: all downloaded content will be placed in
    file. This was disabled in version 1.11, but has been reinstated (with a warning) in 1.11.2, as there
    are some cases where this behavior can actually have some use.
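    For instance (again with a placeholder URL), a command like the following writes the page and every
    requisite it pulls in to the single file everything.html, rather than saving them under their usual
    names:

        wget -p -O everything.html http://example.com/page.html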

    Note that a combination with -k is only permitted when downloading a single document, as in that case
    it will just convert all relative URIs to external ones; -k makes no sense for multiple URIs when
    they're all being downloaded to a single file; -k can be used only when the output is a regular file.
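    A sketch of the permitted single-document case (placeholder URL and filename): this saves one page to
    saved.html and rewrites its relative links as absolute ones pointing back at the original site:

        wget -k -O saved.html http://example.com/page.html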

# To download a single file
wget http://path.to.the/file
# To download a file and change its name
wget http://path.to.the/file -O newname
# To download a file into a directory
wget -P path/to/directory http://path.to.the/file
# To continue an aborted download
wget -c http://path.to.the/file
# To download multiple files from multiple URLs
wget URL1 URL2
# To read a list of URLs from a file and fetch each one
wget -i url_list.txt
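# (url_list.txt is assumed to be a plain-text file with one URL per line, e.g.
#  http://example.com/file1 and http://example.com/file2)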
# To download a page with all its assets (images, CSS) and rewrite links for local viewing
wget -pk http://path.to.the/page.html
# To mirror a whole site locally
wget -mk http://site.tl/
# To download a numbered sequence of files (the {1..15} range is expanded by the shell)
wget http://www.myserver.com/files-{1..15}.tar.bz2
# To download all the files in a directory with a specific extension if directory indexing is enabled
wget -r -l1 -A.extension http://myserver.com/directory
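# For example (.pdf chosen just for illustration), to fetch every linked PDF one level deep
wget -r -l1 -A.pdf http://myserver.com/directory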
# To download just the headers of responses (-S --spider) and display them on stdout (-O -)
wget -S --spider -O - http://google.com
# To change the User-Agent header to 'toto'
wget -U 'toto' http://google.com