Copy all files from an HTTP folder location to a local folder

I'm a student and need to pull down lots of stuff from my professor’s website, preferably retaining some of the folder structure.

I'm working on Windows boxes and have access to Windows XP, Windows 7, and Windows Server 2008 R2. Way back in the day (2-3 years ago) I tried some utilities that mirrored web pages and the like, but for various reasons they never worked right, or I could never get what I wanted out of them.

So, for example, these folders:

http://myUniversity.edu/professor/classLectures/folder1/programmaticFolderABCXYZ
http://myUniversity.edu/professor/classLectures/folder1/programmaticFolder123456
http://myUniversity.edu/professor/classLectures/folder1/programmaticFolder4321
http://myUniversity.edu/professor/classLectures/folder1/programmaticFolder2345
http://myUniversity.edu/professor/classLectures/folder2/programmaticFolderABCXYZ2
http://myUniversity.edu/professor/classLectures/folder2/programmaticFolder1234563
http://myUniversity.edu/professor/classLectures/folder2/programmaticFolder43214
http://myUniversity.edu/professor/classLectures/folder2/programmaticFolder23455

In essence, these folders are a real pain to download manually for later use.

I've tried this utility, and it's either overkill or not-simple-enough-kill, because I could never get it to just download files to my hard drive.

Ideally, I'd like to recursively scan the folder, recreate the folder structure in some specified folder, then copy the files from the remote server to their corresponding folder on my local machine.


The simplest utility for recursively downloading files from a website is wget:

http://gnuwin32.sourceforge.net/packages/wget.htm
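
For example, assuming the server generates HTML directory listings for those folders (wget can only follow links it finds in pages), a command along these lines should recreate the structure under a local folder (C:\classLectures is just a placeholder path):

    wget --recursive --no-parent --no-host-directories --cut-dirs=2 --reject "index.html*" --directory-prefix=C:\classLectures http://myUniversity.edu/professor/classLectures/

Here --no-parent keeps wget from climbing above classLectures, --no-host-directories plus --cut-dirs=2 strips the myUniversity.edu/professor/classLectures prefix so folder1 and folder2 land directly in the target folder, and --reject "index.html*" discards the generated listing pages themselves. Add --level=inf if the tree nests deeper than wget's default five levels.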


Firefox addon: DownThemAll!

Chrome extension: GetThemAll