The examples are divided into three sections for the sake of clarity. The first section is a tutorial for beginners. The second section explains some of the more complex program features. The third section contains advice for mirror administrators, as well as even more complex features (that some would call perverted).
Say you want to download a URL. Just type:

wget http://fly.cc.fer.hr/

The response will be something like:
--13:30:45--  http://fly.cc.fer.hr:80/
           => `index.html'
Connecting to fly.cc.fer.hr:80... connected!
HTTP request sent, fetching headers... done.
Length: 1,749 [text/html]

    0K -> .

13:30:46 (68.32K/s) - `index.html' saved [1749/1749]
If the connection is slow and the file is lengthy, the connection may well fail before the whole file is retrieved. To raise the number of tries to 45 and make sure the whole file arrives safely:

wget --tries=45 http://fly.cc.fer.hr/jpg/flyweb.jpg
To leave Wget working in the background and have it write its progress to the log file `log' (`-t' is the short form of `--tries'):

wget -t 45 -o log http://fly.cc.fer.hr/jpg/flyweb.jpg &

The ampersand at the end of the line makes sure that Wget works in the background. To unlimit the number of retries, use `-t inf'.
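Putting the two together, the same download with unlimited retries would look like this:

wget -t inf -o log http://fly.cc.fer.hr/jpg/flyweb.jpg &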
The usage of FTP is just as simple; Wget takes care of the anonymous login and password:

$ wget ftp://gnjilux.cc.fer.hr/welcome.msg
--23:35:55--  ftp://gnjilux.cc.fer.hr:21/welcome.msg
           => `welcome.msg'
Connecting to gnjilux.cc.fer.hr:21... connected!
Logging in as anonymous ... Logged in!
==> TYPE I ... done.  ==> CWD not needed.
==> PORT ... done.    ==> RETR welcome.msg ... done.
Length: 1,340 (unauthoritative)

    0K -> .

23:35:56 (37.39K/s) - `welcome.msg' saved [1340]
If you specify a directory, Wget will retrieve the directory listing, parse it and convert it to HTML, which you can then view with a browser:

wget ftp://prep.ai.mit.edu/pub/gnu/
lynx index.html
To read the list of URLs from a file, use `-i':

wget -i file

If you specify `-' as the file name, the URLs will be read from standard input.
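For example, a prepared list of URLs can be piped straight into Wget (the file name urls.txt here is just an illustration):

cat urls.txt | wget -i -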
Create a mirror image of the GNU WWW site (with the same directory structure the original has), with only one try per document, saving the log of the activities to `gnulog':

wget -r -t1 http://www.gnu.ai.mit.edu/ -o gnulog
Retrieve only the first level of links from Yahoo's front page:

wget -r -l1 http://www.yahoo.com/
Retrieve the index.html of www.lycos.com, showing the original server headers:

wget -S http://www.lycos.com/
Save the server headers with the file, perhaps to inspect them afterwards:

wget -s http://www.lycos.com/
more index.html
Retrieve the first two levels of wuarchive.wustl.edu, saving them to /tmp:

wget -P/tmp -l2 ftp://wuarchive.wustl.edu/
Suppose you want to download all the GIFs from an HTTP directory. Something like `wget http://host/dir/*.gif' will not work, because HTTP retrieval does not support globbing. In that case, use:

wget -r -l1 --no-parent -A.gif http://host/dir/

It is a bit of a kludge, but it works. `-r -l1' means to retrieve recursively (See section Recursive Retrieval), with maximum depth of 1. `--no-parent' means that references to the parent directory are ignored (See section Directory-Based Limits), and `-A.gif' means to download only the GIF files. `-A "*.gif"' would have worked too.
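For reference, the quoted variant mentioned above would be spelled:

wget -r -l1 --no-parent -A "*.gif" http://host/dir/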
Suppose you were in the middle of downloading when Wget was interrupted, and you do not want to clobber the files already present. Then:

wget -nc -r http://www.gnu.ai.mit.edu/
If you want to encode your own username and password to HTTP or FTP, use the appropriate URL syntax (See section URL Format):

wget ftp://hniksic:mypassword@jagor.srce.hr/.emacs
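The general shape of such a URL, with user, password, host and path standing in as placeholders, is:

wget ftp://user:password@host/path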
If you do not like the default retrieval visualization, you can customize it through the dot settings (See section Wgetrc Commands). For example, many people like the "binary" style of retrieval:

wget --dot-style=binary ftp://prep.ai.mit.edu/pub/gnu/README

You can experiment with other styles, like:
wget --dot-style=mega ftp://ftp.xemacs.org/pub/xemacs/xemacs-20.4/xemacs-20.4.tar.gz
wget --dot-style=micro http://fly.cc.fer.hr/

To make these settings permanent, put them in your `.wgetrc', as described before (See section Sample Wgetrc).
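A minimal sketch of such a `.wgetrc' entry, assuming the wgetrc command for this setting is spelled dot_style, might be:

# Assumed .wgetrc syntax: use the "mega" dot style for every run.
dot_style = mega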
If you wish Wget to keep a mirror of a page (or FTP subdirectories), use `--mirror' (`-m'). You can put Wget in the crontab file, asking it to recheck a site each Sunday:

crontab
0 0 * * 0 wget --mirror ftp://ftp.xemacs.org/pub/xemacs/ -o /home/me/weeklog
You may wish to do the same with someone's home page, but you do not want to download all the images, only the HTML files:

wget --mirror -A.html http://www.w3.org/
To keep a recursive, time-stamping retrieval within a given set of domains, use `-D' (See section Domain Acceptance):

wget -rN -Dsrce.hr http://www.srce.hr/

Now Wget will correctly find out that `regoc.srce.hr' is the same as `www.srce.hr', but will not even take into consideration the link to `www.mit.edu'.
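Since `-D' takes a comma-separated list, several domains can be accepted at once; for instance (the second domain here is just an illustration):

wget -rN -Dsrce.hr,fer.hr http://www.srce.hr/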
To have the absolute links in the downloaded documents converted to relative, so that they are suitable for local viewing, use `-k':

wget -k -r URL
You would like the output documents to go to standard output instead of to files?

wget -O - http://jagor.srce.hr/ http://www.srce.hr/

You can also combine `-O -' with `-i -' and make weird pipelines to retrieve the documents from remote hotlists:
wget -O - http://cool.list.com/ | wget --force-html -i -

Here the first Wget dumps the hotlist page to standard output, and the second one reads it from standard input (`-i -'), treats it as HTML (`--force-html'), and downloads the URLs it finds there.