Website Backup and Download

I've found httrack (http://www.httrack.com/) very useful for this in the past.

If you use any tool (not just httrack) to try to download an entire site, make sure you show a little consideration to the site. See httrack's "what not to do" page for some pointers on that.
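As a rough sketch (the URL, output directory, and filter pattern below are placeholders, and the rate and connection limits are just one way of being polite), a basic httrack invocation looks something like:

    # Mirror the site into ./backup, stay on the same host via the filter,
    # and limit bandwidth and simultaneous connections to go easy on the server.
    httrack "http://example.com/" -O ./backup "+*.example.com/*" --max-rate=50000 -c2 -v

The "+*.example.com/*" filter keeps the crawl from wandering off to other hosts, and the rate/connection limits keep the load on the server reasonable.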


You can use wget to mirror the website (provided it does not rely on Flash or JavaScript-based navigation).

Look here or just check the command's manual. wget is available for Unix systems and Windows.
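For example (the URL is a placeholder, and the exact set of options is a matter of taste), a typical mirroring command might look like:

    # Recursively mirror the site, rewrite links for offline viewing,
    # fetch page requisites (images, CSS), and be polite about request rate.
    wget --mirror --convert-links --page-requisites --adjust-extension \
         --wait=1 --limit-rate=200k --no-parent http://example.com/

--mirror turns on recursion and timestamping, --convert-links rewrites links so the copy can be browsed locally, and --wait / --limit-rate slow the crawl down so you do not hammer the server.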


wget, I believe, will crawl a page for you.

The -r option, I believe, is what you want. Note the part in the following snippet about converting links for offline viewing. Since you said you want to keep this page just in case it "goes up in flames", this will let you browse it locally.

From the man page:

    Wget can follow links in HTML and XHTML pages and create local versions of remote
    web sites, fully recreating the directory structure of the original site. This is
    sometimes referred to as "recursive downloading." While doing that, Wget respects
    the Robot Exclusion Standard (/robots.txt). Wget can be instructed to convert the
    links in downloaded HTML files to the local files for offline viewing.
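Putting that together, a minimal sketch (example.com is a placeholder) might be:

    # -r  recursive download
    # -k  convert links in the downloaded pages so they work offline
    # -p  also fetch the images/CSS needed to render each page
    wget -r -k -p http://example.com/

-k is the short form of --convert-links, which is what makes the local copy browsable after the original site is gone.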
