How do you archive an entire website for offline viewing?
We have actually burned static/archived copies of our ASP.NET websites for customers many times. Until now we have used WebZip, but we have had endless problems with crashes, downloaded pages not being re-linked correctly, and so on.
We basically need an application that crawls and downloads static copies of everything on our ASP.NET website (pages, images, documents, CSS, etc.) and then processes the downloaded pages so they can be browsed locally without an internet connection (getting rid of absolute URLs in links, etc.). The more idiot-proof, the better. This seems like a pretty common and (relatively) simple process, but I have tried a few other applications and have been really unimpressed.
Does anyone have archive software they would recommend? Does anyone have a really simple process they would share?
You could use wget:
wget -m -k -K -E http://url/of/web/site
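Here -m mirrors the site recursively, -k converts the links in the downloaded pages so they work locally, -K keeps a backup of each page before its links are rewritten, and -E saves pages with an .html extension so they open correctly offline. The same command with the long option names, which may be easier to remember (the URL is the same placeholder as above):
wget --mirror --convert-links --backup-converted --adjust-extension http://url/of/web/site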
On Windows, you can look at HTTrack. It's very configurable, allowing you to set the speed of the downloads, but you can also just point it at a website and run it with no configuration at all.
In my experience it's been a really good tool and works well. Some of the things I like about HTTrack are listed below (a sketch of a command-line invocation follows the list):
- Open Source license
- Resumes stopped downloads
- Can update an existing archive
- Can be configured to be non-aggressive when it downloads, so it doesn't waste your bandwidth or the site's
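HTTrack also ships with a command-line client, httrack, which is handy if you want to script the archiving step. A minimal sketch of an invocation is below; the URL and output folder are placeholders, and the throttling flags are worth double-checking against httrack --help for your version:
httrack http://url/of/web/site -O ./site-archive -c4 -A25000 -v
Here -O sets the folder the mirror is written to, -c4 limits HTTrack to four simultaneous connections, -A25000 caps the transfer rate at roughly 25 KB/s so you don't hammer the site, and -v prints progress as it runs.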