How many files in a folder is too many?

I vaguely recall that, many years ago, the Finder in Mac OS X would start having problems if a folder contained more than 2,000-3,000 items.

Apple's documentation says that the HFS Plus file system has a theoretical limit of about 2 billion files per folder in all Mac OS X versions.

What is the practical limit?

Will having 10,000 photos in a folder be a problem?


It seems that around 10,000 is safe. However, I've found that if you go a lot higher, say 50,000, Finder will never even list the files in the directory when you try to browse it. I suspect this is why a lot of data recovery software creates a new folder every 10,000 files when doing raw file carving.
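
If you do end up with such a directory, one workaround is to do what those recovery tools do and split it into chunks yourself. Here is a minimal shell sketch, assuming the files live in ~/big (a hypothetical path) and you want at most 10,000 files per subfolder; the chunk_N naming is my own choice:

cd ~/big
i=0
for f in *; do
  [ -f "$f" ] || continue     # only move regular files
  d=chunk_$((i / 10000))      # chunk_0, chunk_1, ... a new one every 10,000 files
  mkdir -p "$d"
  mv "$f" "$d/"
  i=$((i + 1))
done

Because the glob is expanded by the shell itself rather than handed to an external command, the loop does not run into the Argument list too long limit mentioned in other answers.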


You can easily try this yourself by running the following in Terminal

mkdir ~/t                                      # scratch folder in your home directory
cd ~/t
dd if=/dev/random of=test bs=1024 count=16     # one 16 kB file of random data
for i in {1..10000}; do cp test test.$i; done  # 10,000 copies of it

to create a folder containing 10,000 files of 16 kB each (replace the count=16 in the third line with another number for differently sized files).


To answer with a practical example: I currently have 326,000 files in a folder, created by an application that downloads data from a server. The files are zipped XML files, and my application extracts the XML data from them and stores it in a local database.

The application runs from the command line. Everything works fine, except that rm * or ls * fails because of wildcard expansion (the error message is Argument list too long). Since the files are stored in a temporary folder, I can simply remove the whole folder after processing the files.
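
Incidentally, the Argument list too long error is the kernel's ARG_MAX limit on the combined length of arguments passed to an external command; the directory itself is fine, and a plain ls without a pattern works because the shell expands nothing. A few sketched workarounds that avoid expanding the file list (run from inside the folder in question):

ls                                    # no glob, so no expansion: lists everything
find . -type f | wc -l                # count the files without expanding their names
find . -type f -delete                # delete them without hitting ARG_MAX
find . -type f -print0 | xargs -0 rm  # equivalent deletion via xargs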

I didn't try to open the folder with Finder, though. I suspect that would be very slow, if it worked at all.


There are several limits to consider, some of which have been touched on in the comments:

  • argument length and shell expansion - a simple echo * will fail if the concatenated length of the filenames the asterisk expands to exceeds the system's ARG_MAX limit. If you run into this snare, find will often be your friend: find . -maxdepth 1 -type f -exec echo {} \; is a working replacement for the innocent echo * mentioned above, limited to regular files only (replace echo with the action of your choice).

  • per-program limits on the size of the internal data structures used to hold directory contents (Finder, and all kinds of tools that try to read directory listings).

  • directory lookup cache size. While the file system may be able to hold 2.1 billion files within the on-disk structure of a directory, working with that number won't be pleasant, and you'd be well advised to introduce some strategy for sorting files into subdirectories if you're dealing with structures of that size; a sketch of one such scheme follows this list. (Hint: the people designing web caching structures had to deal with exactly this; see Maltzahn/Richardson, Reducing the Disk I/O of Web Proxy Server Caches, USENIX 1999.)
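
As a rough illustration of the subdirectory strategy from the last point, here is a minimal shell sketch in the spirit of those web-cache layouts: each file is moved into one of at most 256 buckets chosen by a hash of its name, so no single directory grows without bound. The two-character prefix and the use of the macOS md5 utility are my own illustrative choices:

cd ~/bigdir                          # hypothetical folder holding the files
for f in *; do
  [ -f "$f" ] || continue            # only move regular files
  h=$(md5 -q -s "$f")                # hash the file name (-s hashes a string)
  d=${h:0:2}                         # first two hex digits: up to 256 buckets
  mkdir -p "$d"
  mv "$f" "$d/"
done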

To speed up access to frequently used on-disk structures, file systems use in-memory caches, and the size of these caches is limited. This is where the sudden penalty for large and less-than-optimally structured directories starts to hit. Depending on the frequency and intensity of access to these directories, the penalty can be significant.

The 2015 article by Tsai et al., How to get more value from your file system directory cache, is probably one of the easier introductions to the subject.


Apple has a support document related to that:

Maximum number of files (or files and folders) in a folder (all Mac OS X versions)

Up to 2.1 billion