How to identify the largest files in a directory, including its subdirectories?

Here's an example using BSD find and stat:

find . -type f -exec stat -f '%z %N' {} + | sort -nr
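
If you only want the top few results, you can cap the output with head; for example, the ten largest files (building directly on the command above):

find . -type f -exec stat -f '%z %N' {} + | sort -nr | head -n 10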

I like Spotlight better than crawling the filesystem, for speed reasons. It doesn't directly report sizes, so you can't sort out a top 10, but you can locate places where large files exist very quickly (in a fraction of a second).

mdfind 'kMDItemFSSize > 2000000000' | grep "$(pwd)"

More options are in this thread. Add or remove zeroes, or pick whatever size threshold suits you, for finding large files.
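
If you do want sorted sizes out of the Spotlight results, a minimal sketch (reusing the BSD stat call from the answer above) is to restrict the search with -onlyin and run each path through stat:

mdfind -onlyin . 'kMDItemFSSize > 2000000000' | while IFS= read -r f; do stat -f '%z %N' "$f"; done | sort -nr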


As to your syntax errors: another problem you'll face is handling spaces in file and directory names, hence the quotes around $(pwd). You can quickly get into territory where you need to learn a bit more about shells, pipes, whitespace and escaping words / variables. Worst case, isolate each element above so you can learn what the output of the find command is telling you, then layer the filters / processing back on once you have valid results.
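
To see why the quoting matters, suppose the current directory were /Users/me/My Files (a made-up path with a space in it):

mdfind 'kMDItemFSSize > 2000000000' | grep $(pwd)      # word-splits: grep gets the pattern /Users/me/My and a file operand Files, and ignores the piped input
mdfind 'kMDItemFSSize > 2000000000' | grep "$(pwd)"    # the whole path is passed to grep as a single pattern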

If all you are after is cleaning up large files, you don't need a script and can learn that later. I recommend System Information's Storage Management or DaisyDisk. Finder can also calculate all sizes (it's a setting in its view options) if you want to check sizes as you navigate.


BSD find (which is part of macOS) and GNU find (which is part of Linux distributions) use different options, so your examples won't work.

  • You can install GNU find via Homebrew (brew install findutils) and then use gfind to get GNU syntax/options (see the sketch after this list)
  • For a quick solution, du -a TOP-DIRECTORY | sort -nr will work as well. It lists both files and directories though, so it won't easily work within a script
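
As a rough sketch of the GNU route (assuming findutils has been installed via Homebrew as described above), gfind's -printf can print size and path directly, mirroring the BSD stat pipeline:

gfind . -type f -printf '%s %p\n' | sort -nr | head -n 10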

If you're looking for the largest file(s) in a directory and its child subdirectories:

Use the du command:

du -ah . | sort -hr | head -n X

Where X is the number of results you want. For example, if you want the top 25, substitute 25 for X. If you only want the largest file, use 1. The -h on du lists sizes in "human readable format", and sort -h orders those human-readable values correctly (plain sort -n would mis-sort suffixes like K, M and G).
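
If the directory totals in du's output get in the way, one possible variant (not part of the original answer) is to let find enumerate only regular files and hand them to du, then sort the human-readable sizes with sort -h:

find . -type f -exec du -h {} + | sort -hr | head -n 25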