How to find files that are eating up disk space
Robert Huff
roberthuff at rcn.com
Wed Dec 17 17:59:00 UTC 2008
John Almberg writes:
> > Is there a command line tool that will help me figure out where the
> > problem is?
>
> I should probably have mentioned that what I currently do is run
>
> du -h -d0 /
>
> and gradually work my way down the tree, until I find the
> directory that is hogging disk space. This works, but is not
> exactly efficient.
"-d0" limits the search to the indicated directory; i.e. what
you can see by doing "ls -al /". Not superior to "ls -al /" and
using the Mark I eyeball.
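For illustration (assuming a du that takes the -d flag, as in the
original command), "-d1" is the slightly more useful variant for that
manual walk down the tree:

	du -h -d0 /		# one line: the grand total for /
	du -h -d1 /		# one line per directory directly under /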
	What (I think) you want is "du -x -h /": infinite depth, but do
not cross filesystem mount points.  This is still awkward in that it
returns a list where the sizes sit in a fixed-width field and are
visually distinguished only by the trailing unit letter (K, M, G), so
they cannot simply be sorted.
Try this:
du -x /
and run the results through "sort":
sort -nr
and those results through "head":
head -n 20
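Strung together, that is one pipeline (the 20-line cutoff is just a
convenient choice; adjust to taste):

	du -x / | sort -nr | head -n 20

Without "-h", du prints plain numeric block counts, which is exactly
what "sort -nr" needs to put the largest directories first.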
I have a cron job which does this for /usr and e-mails me the
output every morning. After a few days, weeks at most, I know what
should be on that list ... and what shouldn't and needs
investigating.
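	A minimal sketch of such a cron entry, assuming the stock mail(1)
command is available; the hour and the recipient address here are made
up:

	# /etc/crontab: daily du report for /usr at 06:25
	25	6	*	*	*	root	du -x /usr | sort -nr | head -n 20 | mail -s "du report: /usr" you@example.com

(In a per-user crontab, drop the "root" user field.)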
Robert Huff