How can I download an entire website?

I want to download a whole website (with sub-sites). Is there any tool for that?


Solution 1:

Try example 10 from the wget examples article:

wget --mirror -p --convert-links -P ./LOCAL-DIR WEBSITE-URL
  • --mirror : turn on options suitable for mirroring (recursion and time-stamping, among others).

  • -p : download all files that are necessary to properly display a given HTML page.

  • --convert-links : after the download, convert the links in the documents for local viewing.

  • -P ./LOCAL-DIR : save all the files and directories to the specified directory.
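As a rough sketch, a concrete invocation might look like the following; the URL and output directory are placeholders, and --adjust-extension and --wait are optional extras (they add .html extensions where needed and throttle requests), not part of the command above:

# Mirror https://example.com into ./example-mirror (placeholder values)
wget --mirror -p --convert-links --adjust-extension --wait=1 \
     -P ./example-mirror https://example.com/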

Solution 2:

HTTrack for Linux: copying websites for offline viewing

httrack is the tool you are looking for.

HTTrack allows you to download a World Wide Web site from the Internet to a local directory, recursively building all directories and getting HTML, images, and other files from the server onto your computer. HTTrack preserves the original site's relative link structure.
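A minimal sketch of a typical invocation follows; the URL, output directory, and filter pattern are placeholders to adapt to your own site:

# Copy https://example.com into /tmp/example-mirror, following only links on that host
# -O sets the output path, the "+..." pattern limits which URLs are followed, -v prints progress
httrack "https://example.com/" -O "/tmp/example-mirror" "+example.com/*" -v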