Which software should I use to 'save' a website?
I want to save a few pages from a website so I can view them offline. I'm looking for good free software that can download a site, let me browse it offline, and also search the saved pages for text.
Can anybody recommend any software?
I like HTTrack. Very flexible yet relatively easy to use.
As much as I like wget, the visual feedback HTTrack gives makes it much easier to spot problems with a mirroring operation, and that saves a lot of time and frustration.
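If you prefer the command line, HTTrack also ships a console client. A minimal sketch of an invocation, where the URL and output directory are placeholders:

```
# Mirror a site into a local directory using HTTrack's command-line client.
# "https://example.com/" and "./mysite" are placeholder values.
httrack "https://example.com/" -O "./mysite"
```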
I use wget, which is available for Windows and included with pretty much every *nix out there. The -m option mirrors a website, though getting a clean offline copy has always required some fiddling.
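A typical invocation might look like the sketch below; example.com is a placeholder, and the extra flags (all standard wget options) rewrite links for local viewing and pull in the images and stylesheets each page needs:

```
# Mirror the site, rewrite links so they work locally, and fetch page
# requisites (images, CSS); --no-parent keeps wget from wandering up
# the directory tree. The URL is a placeholder.
wget --mirror --convert-links --page-requisites --no-parent https://example.com/
```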
Teleport Pro is something I've always seen recommended for this kind of problem on Windows, but I've never used it myself.
Once the files are downloaded, you can search their contents using any grep-like tool.
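For example, assuming the mirror landed in a directory named example.com (which is what wget's default layout would produce):

```
# Recursively (-r) and case-insensitively (-i) list (-l) the saved files
# that contain the phrase. The directory name is an assumption based on
# wget's default output layout.
grep -r -i -l "search term" ./example.com/
```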
wget is a command-line tool that can do this. Once you've downloaded the files, you can search them with any file-searching utility.
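On Windows, for instance, the built-in findstr command can do the searching; the pattern and file mask below are placeholders:

```
rem Recursively (/s) and case-insensitively (/i) search the saved HTML
rem files for a phrase; run this from the download directory.
findstr /s /i "search term" *.html
```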