How can I determine what is taking up so much space? [duplicate]
I was using df -h to print out human-readable disk usage. I would like to figure out what is taking up so much space. For instance, is there a way to pipe this command so that it prints out files that are larger than 1 GB? Other ideas?
I use this one a lot:
du -kscx *
It can take a while to run, but it'll tell you where the disk space is being used.
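If you'd rather see the biggest entries last, you can pipe the same command through a numeric sort (a minor variation; it works because the -k flag prints sizes in 1 KiB blocks, so a plain numeric sort is enough):
du -kscx * | sort -n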
You may want to try the ncdu utility, found at http://dev.yorhel.nl/ncdu. It will quickly sum the contents of a filesystem or directory tree and print the results, sorted by size. It's a really nice way to drill down interactively and see what's consuming drive space. Additionally, it can be faster than some du combinations.
The typical output looks like:
ncdu 1.7 ~ Use the arrow keys to navigate, press ? for help
--- /data ----------------------------------------------------------------------------------------------------------
163.3GiB [##########] /docimages
84.4GiB [##### ] /data
82.0GiB [##### ] /sldata
56.2GiB [### ] /prt
40.1GiB [## ] /slisam
30.8GiB [# ] /isam
18.3GiB [# ] /mail
10.2GiB [ ] /export
3.9GiB [ ] /edi
1.7GiB [ ] /io
1.2GiB [ ] /dmt
896.7MiB [ ] /src
821.5MiB [ ] /upload
691.1MiB [ ] /client
686.8MiB [ ] /cocoon
542.5MiB [ ] /hist
358.1MiB [ ] /savsrc
228.9MiB [ ] /help
108.1MiB [ ] /savbin
101.2MiB [ ] /dm
40.7MiB [ ] /download
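To scan a specific directory without crossing into other mounted filesystems, an invocation like the following should work with recent versions of ncdu (the -x flag keeps the scan on one filesystem; /data is simply the directory from the sample output above):
ncdu -x /data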
You can use the find command. For example, to list files larger than roughly 1 GB (the size here is given in bytes):
find /home/ -size +1073700000c -print
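With GNU find you can also write the size with a unit suffix, which is easier to read; this matches files larger than one gibibyte:
find /home/ -size +1G -print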
I myself use
du -c --max-depth=4 /dir | sort -n
This returns the amount of space used by each directory and its subdirectories, up to 4 levels deep; sort -n puts the largest last.
Newer versions of sort can handle "human-readable" sizes, so you can use the much more readable
du -hc --max-depth=4 /dir | sort -h
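If the listing is long, you can also trim it to just the largest entries with tail (a small addition to the command above; the count of 20 is arbitrary):
du -hc --max-depth=4 /dir | sort -h | tail -n 20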