Download a zip archive:
wget http://website.com/archive.zip
and archive.zip will be downloaded. But wget takes a lot of parameters that change how it downloads. Read on!
Download a whole site recursively:
wget -r http://website.com
This downloads all files recursively: images, HTML files, etc. But hammering the server with rapid requests could get you banned, so to avoid that:
wget --random-wait --limit-rate=20k -r http://website.com
--random-wait means wget downloads a file, waits a random period of time, downloads the next file, and so on.
--limit-rate=20k caps the download speed at 20 KB/s so you don't get banned.
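To get a feel for what a 20 KB/s cap means in practice, here is a quick back-of-the-envelope sketch; the 5 MB file size is made up for illustration:

```shell
# Rough estimate: time to fetch a 5 MB file at wget's --limit-rate=20k.
SIZE_KB=$((5 * 1024))   # 5 MB expressed in KB
RATE_KB=20              # the 20k rate limit
echo "$((SIZE_KB / RATE_KB)) seconds"   # prints: 256 seconds
```

So a rate limit trades speed for politeness: the same mirror job simply takes longer.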
Or you could also do:
wget --wait=20 --limit-rate=20k -r -p -U Mozilla http://website.com
--wait=20 waits 20 seconds between each file download, but I think it's better to use --random-wait, since a fixed interval is easier for servers to detect.
-p tells wget to also download everything needed to properly display each page (images, stylesheets, and so on), as if you were actually looking at it.
-U Mozilla sets the user-agent string, so the website believes you are using a Mozilla-based browser.
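The options above compose nicely into a small wrapper script. This is only a sketch: SITE is a placeholder URL, and the script just prints the command (a dry run) instead of fetching anything, so you can review it first:

```shell
#!/bin/sh
# Dry-run sketch of a "polite mirror" command built from the options above.
# SITE is a placeholder; replace it with the site you actually want to mirror.
SITE="http://website.com"
set -- --wait=20 --limit-rate=20k -r -p -U Mozilla "$SITE"
# Print rather than run, so the assembled command can be inspected.
echo "wget $*"
```

Drop the echo (leaving just `wget "$@"`) once you are happy with the flags.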
And here is how to download all images, videos, or whatever you want from a website:
wget -r -A jpg,png http://website.com
With this command, you download all jpg and png files from website.com. If you want to download all mp3s instead, you would use -A mp3.
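If you are unsure what an accept list will match, the idea behind -A jpg,png can be sketched in plain shell. The file names here are invented for illustration; wget applies essentially this suffix test to each file it encounters during the recursive crawl:

```shell
# Sketch of the accept-list behind -A jpg,png: keep matching names, skip the rest.
# The file names are made up for this example.
for f in photo.jpg icon.png page.html; do
  case "$f" in
    *.jpg|*.png) echo "keep $f" ;;
    *)           echo "skip $f" ;;
  esac
done
```

Here photo.jpg and icon.png are kept while page.html is skipped (though wget still has to fetch HTML pages temporarily to follow their links).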
You can also use a GUI for wget if you want. It's called Gwget and should be in your distribution repositories. For Ubuntu, do:
sudo apt-get install gwget