I need to get a listing of a directory that contains about 2 million files, but when I run an ls command on it, nothing comes back. I've waited 3 hours. I've tried ls | tee directory.txt, but that seems to hang forever.

I assume the server is doing a lot of inode sorting. Is there any way to speed up the ls command to just get a listing of filenames? I don't care about size, dates, permissions or the like at this time.


ls -U

will do the ls without sorting.
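A quick way to see the difference (the pipe to head is just an illustration) is that the unsorted form can start printing entries as it reads them, while the default form has to read and sort the entire directory before the first line appears:

ls -U | head    # starts printing almost immediately, in directory order
ls | head       # waits until every entry has been read and sorted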

Another source of slowness is --color. On some Linux machines there is a convenience alias that adds --color=auto to the ls call, making it look up file attributes for each file found (slow) in order to color the display. This can be avoided with ls -U --color=never, or by bypassing the alias with \ls -U.
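To check whether your shell has such an alias, and to bypass it for a single invocation, something like the following works (the exact alias text varies by distribution):

type ls            # shows e.g. "ls is aliased to `ls --color=auto'"
\ls -U             # a leading backslash skips the alias for this one call
command ls -U      # same idea, using the shell's command builtin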


I have a directory with 4 million files in it, and the only way I could get ls to spit out files immediately, without a lot of churning first, was

ls -1U
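Since the original goal was to capture the listing in a file (directory.txt is just the name used in the question), a plain redirect keeps ls from doing any extra work:

ls -1U > directory.txt    # one unsorted filename per line, written straight to the file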

Try using:

find . -maxdepth 1 -type f

This will list only the files in the directory; leave out the -type f argument if you want to list both files and directories.
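For example, assuming GNU find and reusing the directory.txt name from the question (the -mindepth 1 in the second line just keeps find from printing the starting directory . itself):

find . -maxdepth 1 -type f > directory.txt          # regular files only
find . -mindepth 1 -maxdepth 1 > directory.txt      # files and subdirectories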