How can I view all files in a website's directory?

Is it possible to list all files and directories in a given website's directory from the Linux shell?

Something similar to:

ls -l some_directory

but instead of some_directory it would take a URL, e.g. ls -l http://www.some_site.com/some_directory/. Obviously, the latter will not work.


Solution 1:

I was just wondering the same thing. The following is probably not the most efficient solution, but it seems to work: it recreates the directory structure of the web server locally. (I found the first command via Stack Overflow.)

wget --spider -r --no-parent http://some.served.dir.ca/
ls -l some.served.dir.ca
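
If you would rather not browse the local mirror, the same spider run can be post-processed to print every discovered URL directly. A minimal sketch, assuming GNU wget (which writes its log to stderr) and standard grep/sort:

# Run the spider, then pull the URLs out of the log and de-duplicate them
wget --spider -r --no-parent http://some.served.dir.ca/ 2>&1 \
  | grep -o 'http://[^ ]*' | sort -u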

Solution 2:

Yes, it is possible. Sometimes.

When you browse to a web page (say http://demo.domain.tld/testdir/index.html), the server returns the file you specified (in this case index.html).

If you do not specify a file and a default is present (e.g. the web server is configured to serve index.html or index.php) and you type http://demo.domain.tld/testdir/, then it will automagically present you with the right file.
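
For example, with curl you can check that the bare directory URL and the default file resolve to the same document (hypothetical URLs, assuming index.html is the configured default):

# Fetch both forms and compare them
curl -s http://demo.domain.tld/testdir/ > with_default.html
curl -s http://demo.domain.tld/testdir/index.html > explicit.html
diff with_default.html explicit.html   # no output means they are identical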

If no such file is present, the server can do other things instead, such as listing the directory contents. This is quite useful while building a site, but it is also considered unwise from a security standpoint.
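
A quick way to test whether a directory exposes such a listing (hypothetical URL; the "Index of" marker is what Apache's default auto-index page uses, other servers may differ):

# Prints a match only if a directory index page is being served
curl -s http://demo.domain.tld/testdir/ | grep -i 'Index of'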

TL;DR: Yes, sometimes it is possible.

However, the more practical approach is to simply SSH, [s]FTP or RDP into the web server and issue a local directory listing command.
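
For instance, with SSH access a single remote command does the job (user, host and document-root path are placeholders):

# List the directory on the server itself
ssh user@demo.domain.tld 'ls -l /var/www/html/testdir'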