Wget is a computer program that retrieves content from web servers and is part of the GNU Project. Its name derives from "World Wide Web" and "get", reflecting its primary function. Its best features include recursive download, conversion of links for offline viewing of local HTML, support for proxies, and much more.
So I decided to download a website for offline use with wget. Here we are downloading my website to the /tmp folder.
# /usr/bin/wget -r -N -c -m -k http://vidyadhards.blogspot.com/
-r Turn on recursive retrieving
-N Turn on time-stamping
-c Continue a partially downloaded file
-m Create a mirror (turns on options suitable for mirroring)
-k Convert the links for local viewing
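According to the wget manual, -m is itself shorthand for -r -N -l inf --no-remove-listing, so the mirror command above could also be spelled out explicitly. A minimal sketch of that equivalent invocation (the URL is the one from this article; the command is only built and printed here, not executed):

```shell
# -m expands to: recursion (-r), time-stamping (-N), infinite depth (-l inf),
# and keeping FTP directory listings (--no-remove-listing).
URL="http://vidyadhards.blogspot.com/"
CMD="wget -r -N -l inf --no-remove-listing -k -c $URL"
echo "$CMD"
```

Running the printed command should behave the same as the shorter -m -k -c form.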
Download time depends on the website contents.
All the content will be fetched into the /tmp/vidyadhar/vidyadhards.blogspot.com directory.
For more options, see “man wget”.
P.S. Don’t use this for illegal purposes.
If you find any missing points here, please let us know in the comment section or tweet us at @linuxreaders. To get more articles like this, subscribe to our RSS feeds / mails.