How many files in a directory is too many? (Downloading data from net)

Performance varies according to the filesystem you're using.

  • FAT: forget it :) (ok, I think the limit is 512 files per directory)
  • NTFS: Although it can hold about 4 billion files per folder, it degrades relatively quickly - at around a thousand files you will start to notice performance issues, and at several thousand you'll see Explorer appear to hang for quite a while.
  • EXT3: the hard limit is roughly 32,000 subdirectories per directory (the number of files is only bounded by inodes), but performance suffers after several thousand files too.
  • EXT4: effectively no limit on files per directory.
  • ReiserFS, XFS, JFS, BTRFS: these are the good ones for lots of files in a directory. They're more modern and were designed to handle many files (the others date back to the days when hard drives were measured in MB rather than GB). Performance with lots of files is much better (along with ext4) because they use a binary-search-style tree lookup to find the file you want, whereas the others scan the directory more or less linearly.


I store images for serving by a web server, and I have over 300,000 images in one directory on EXT3. I see no performance issues. Before setting this up, I tested with 500k images in a directory, randomly accessing files by name, and there was no significant slowdown with 500k images versus 10k images in the directory.
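
If you want to repeat that kind of test, here's a minimal sketch (not my original script): it fills a directory with dummy files, then times random opens by name, which is roughly what a web server does when a client requests a particular image. The directory path and counts are just placeholders.

```python
import os
import random
import time

# Hypothetical test location and sizes; adjust to your own setup.
TEST_DIR = "/tmp/many_files_test"
FILE_COUNT = 10_000      # raise to 500_000 to repeat the bigger test
LOOKUPS = 5_000

os.makedirs(TEST_DIR, exist_ok=True)

# Fill the directory with small dummy files (slow for large counts; run once).
for i in range(FILE_COUNT):
    path = os.path.join(TEST_DIR, f"img_{i:07d}.jpg")
    if not os.path.exists(path):
        with open(path, "wb") as f:
            f.write(b"x")

# Time random access by name.
names = [f"img_{random.randrange(FILE_COUNT):07d}.jpg" for _ in range(LOOKUPS)]
start = time.perf_counter()
for name in names:
    with open(os.path.join(TEST_DIR, name), "rb") as f:
        f.read()
elapsed = time.perf_counter() - start
print(f"{LOOKUPS} random opens in {elapsed:.3f}s ({LOOKUPS / elapsed:.0f} opens/sec)")
```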

The only downside I see is that in order to sync the new ones with a second server I have to run rsync over the whole directory, and can't just tell it to sync a subdirectory containing the most recent thousand or so.
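
What that sync amounts to, roughly, is the sketch below (the host name and paths are made up, not my real setup):

```python
import subprocess

# Hypothetical source directory and destination host/path.
SRC = "/var/www/images/"            # trailing slash: sync the directory contents
DEST = "webserver2:/var/www/images/"

# rsync has to walk and compare metadata for the entire directory
# (300k+ files) on every run, even if only a thousand files are new.
subprocess.run(["rsync", "-a", SRC, DEST], check=True)
```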