Several posters have already given you a number of suggestions.
What I would also suggest is throwing together a quick awk or perl
script to run your favorite of those tools, parse the output, compare
it against a "threshold" value, and email yourself anything greater.
Then run the script each night from cron.  You could also have the
script check the df output first and only continue if less than 20%
is left, then add parameters to a find command to report only the
largest files older than a certain date.  It makes a good reminder
to either clean up or archive files.
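A minimal sketch of that nightly job might look like the following.
The threshold, the watched path, and the find(1) limits are all
illustrative (GNU/BSD find is assumed for -size +10M), and the mail
address in the comment is hypothetical:

```shell
#!/bin/sh
# Nightly disk-space reminder, along the lines described above.
THRESHOLD=80            # complain when the filesystem is this % full
FS="${FS:-/tmp}"        # point this at your primary volume, e.g. /

# Succeed (exit 0) when usage meets or exceeds the threshold.
over_threshold() {
    [ "$1" -ge "$2" ]
}

# df -P prints one POSIX-format line per filesystem; field 5 is "NN%".
usage=$(df -P "$FS" | awk 'NR==2 { sub(/%/, "", $5); print $5 }')

if over_threshold "$usage" "$THRESHOLD"; then
    # Report the 20 largest files not touched in 30+ days.  In the
    # real cron job, pipe this into something like:
    #     mail -s "disk usage at ${usage}%" you@example.com
    find "$FS" -xdev -type f -mtime +30 -size +10M \
         -exec du -k {} + 2>/dev/null | sort -rn | head -20
fi
```

Dropped into cron with an entry like
"0 2 * * * /usr/local/bin/diskwatch.sh" (path hypothetical), it nags
you only on the nights the disk is actually filling up.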

On Thursday 17 March 2005 01:48 am, Aaron Bockover wrote:
Nautilus gave me a shocking message a few moments ago: not enough
disk space.

A quick df -h showed that my primary volume was indeed 100% used.
Sad really. I am a data hoarder. It's compulsive.

My question is this: is there a shell tool or script in existence
that can be run on a given base directory and calculate the total
size of the *files* in a directory (not recursively), and report
those larger than some limit? Also something to report single large
files larger than some limit?
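Both reports can be had from find plus du, as in this sketch.  GNU
find and coreutils are assumed (-maxdepth, --files0-from, and -printf
are GNU extensions), and the throwaway demo directory and 1 MB cutoff
are illustrative stand-ins for the real base directory and limit:

```shell
#!/bin/sh
# Build a small demo tree so the commands have something to report.
demo=$(mktemp -d)
dd if=/dev/zero of="$demo/big.dat"   bs=1024 count=2048 2>/dev/null  # ~2 MB
dd if=/dev/zero of="$demo/small.dat" bs=1024 count=16   2>/dev/null  # ~16 KB

# 1. Total size of the regular files directly in the directory,
#    without recursing into subdirectories:
total=$(find "$demo" -maxdepth 1 -type f -print0 |
        du -ch --files0-from=- | awk 'END { print $1 }')
echo "files directly in $demo: $total"

# 2. Individual files under the directory larger than the limit,
#    biggest first (%s is the size in bytes):
large=$(find "$demo" -type f -size +1M -printf '%s\t%p\n' | sort -rn)
echo "$large"

rm -rf "$demo"
```

Point the same two pipelines at your real base directory and raise
the -size cutoff to taste.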

--
Scott G. Hall
Raleigh, NC, USA
[EMAIL PROTECTED]

--
TriLUG mailing list : http://www.trilug.org/mailman/listinfo/trilug
TriLUG Organizational FAQ : http://trilug.org/faq/
TriLUG Member Services FAQ : http://members.trilug.org/services_faq/
TriLUG PGP Keyring : http://trilug.org/~chrish/trilug.asc