Recursive download (`wget -r`) equivalent for Firefox?

Solution 1:

DownThemAll doesn't do recursive downloading. It only grabs links from the current page, and for those linked URLs it downloads just the HTML page itself; the linked pages' images and media are not downloaded.

Solution 2:

SpiderZilla is meant to do that, but it's a bit old (2006).
It is based on the HTTrack website copier, which has updated versions for all platforms.

There is also another, older add-on that lets you plug in 'wget' itself (among other things).

However, I too feel that DownThemAll is probably a good choice.
If you know what you want to mirror, selecting the right links should not be a problem.
And, you can always tick the 'All' checkbox.

So, +1 for DownThemAll if you want to stick to the browser.
And use HTTrack if you want a standalone tool (and wget is not handy); a minimal command-line sketch follows.
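
As a rough sketch of the HTTrack route (the URL, output directory, and filter below are placeholders, not anything from the original answer):

```
# Mirror a site into ./mirror, staying within example.com (placeholder URL and filter)
httrack 'https://example.com/' -O ./mirror '+example.com/*' -v
```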

Update: you may also want to look at HTTrack's votes at this bounty question,
How can I download an entire website?

Solution 3:

You can use wget -r with cookies from the browser, extracted after authorization.

Firefox has "Copy as cURL" option in the context menu of the page request in the Network tab of Web Developer Tools, hotkey Ctrl+Shift+Q (you may need to reload the page after opening the tools): screenshot

Replace curl's header flag -H with wget's --header, and you have all the headers you need, including cookies, to continue the browser session with wget.
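
For example, assuming the copied command looked something like the commented-out curl line below (the URL and cookie value are made-up placeholders), the wget equivalent might be:

```
# Copied from Firefox's "Copy as cURL" (placeholder URL and cookie value):
#   curl 'https://example.com/members/index.html' \
#        -H 'User-Agent: Mozilla/5.0 ...' \
#        -H 'Cookie: sessionid=abc123'

# Same headers passed to a recursive wget; -r, -l, and -np control the recursion
wget -r -l 2 -np \
     --header='User-Agent: Mozilla/5.0 ...' \
     --header='Cookie: sessionid=abc123' \
     'https://example.com/members/index.html'
```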