Downloading complete web pages (not sites)
Have you tried wget --page-requisites?
This option causes Wget to download all the files that are necessary to properly display a given HTML page. This includes such things as inlined images, sounds, and referenced stylesheets.
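For example, to save a single page together with everything needed to view it offline, something like the command below should work. The extra flags besides --page-requisites are optional but commonly useful; the URL and the saved-page directory name are just placeholders:

    # -p  (--page-requisites)  grab images, CSS, etc. needed to display the page
    # -k  (--convert-links)    rewrite links so the saved copy works offline
    # -E  (--adjust-extension) save pages with an .html extension
    # -H  (--span-hosts)       also fetch page requisites hosted on other domains
    # -P  (--directory-prefix) put everything under the saved-page/ directory
    wget -p -k -E -H -P saved-page http://example.com/some/article.html

Because there is no -r (recursive) flag, this only fetches the one page and its requisites, not the whole site.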
You could use a Firefox extension: ScrapBook https://addons.mozilla.org/en-US/firefox/addon/427
ScrapBook is a Firefox extension that helps you save Web pages and easily manage collections. Its key strengths are lightness, speed, accuracy, and multi-language support. Major features:
- Save Web page
- Save snippet of Web page
- Save Web site
- Organize the collection in the same way as Bookmarks
- Full text search and quick filtering search of the collection
- Editing of the collected Web page
- Text/HTML edit feature resembling Opera's Notes
Not a scripting solution, but I use ScrapBook to archive sites for later reading. It's a wonderful extension!
If you have an iPhone, you can use a service called Instapaper. It lets you bookmark pages for reading later (via a small bookmarklet in your browser). Once bookmarked, the pages can be synced to your iPhone over the air (Wi-Fi or cellular) from the Instapaper servers. Once the app has finished syncing, all the data is stored locally on the iPhone.
As an added bonus, the Instapaper servers process the page and can serve a text-only version (a graphical version is also available), which can be easier to read.
I use the service and find it excellent for filling my hour-long commute on the train.