Download a webpage and its dependencies, including CSS images [closed]
Often I need to download a webpage and then edit it offline. I have tried a few tools, and the main feature they lack is downloading images referenced in the CSS files.
Is there a tool (for Linux) that will download everything so that the webpage will render the same offline (excluding AJAX)?
Solution 1:
wget --page-requisites http://example.com/your/page.html
This option causes Wget to download all the files that are necessary to properly display a given HTML page. This includes such things as inlined images, sounds, and referenced stylesheets.
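In practice, --page-requisites is usually combined with a couple of other options to get a copy that renders well offline. A minimal sketch (the URL is a placeholder):

wget --page-requisites --convert-links --span-hosts http://example.com/your/page.html

Here --convert-links rewrites the links in the downloaded files so they point at the local copies, and --span-hosts lets wget fetch requisites hosted on other domains (e.g. a CDN).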
EDIT: meder is right: stock wget does not parse CSS files or download the images they reference. There is, however, a patch that adds this feature: [1, 2]
UPDATE: The patch mentioned above has been merged into wget 1.12, released 22-Sep-2009:
** Added support for CSS. This includes:
- Parsing links from CSS files, and from CSS content found in HTML style tags and attributes.
- Supporting conversion of links found within CSS content, when --convert-links is specified.
- Ensuring that CSS files end in the ".css" filename extension, when --convert-links is specified.
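To check whether your installed wget is new enough to include this CSS support, inspect the reported version; 1.12 or later will do:

wget --version | head -n 1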
Solution 2:
It's possible to do this through Firefox; see this forum thread:
- Right-click the page
- View Page Info
- Select the Media tab
- Highlight all files
- Save As
Reference - http://www.webdeveloper.com/forum/showthread.php?t=212610