The website claims that the "net-result is something similar to KDirStat, however the data is more dense, and the representation more informative."
--Thomas
Aaron Bockover writes:
Nautilus gave me a shocking message a few moments ago: not enough disk space.
A quick df -h showed that my primary volume was indeed 100% used. Sad really. I am a data hoarder. It's compulsive.
My question is this: is there an existing shell tool or script that can be run on a given base directory to calculate the total size of the *files* directly in each directory (not recursively) and report the directories larger than some limit? And something similar to report individual files larger than some limit?
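[Editor's note: a rough sketch of what such a script could look like, assuming GNU find and awk (`-maxdepth`, `-printf`, and the `M` size suffix are GNU extensions); the function names and MB limits are placeholders, not an existing tool.]

```shell
#!/bin/sh
# report_heavy_dirs BASE LIMIT_MB
# For every directory under BASE, total the regular files sitting
# directly inside it (non-recursive) and print those over LIMIT_MB.
report_heavy_dirs() {
    base="$1"; limit_mb="$2"
    find "$base" -type d | while read -r d; do
        # Sum byte sizes of immediate files, convert to whole MB.
        total=$(find "$d" -maxdepth 1 -type f -printf '%s\n' \
                | awk '{s+=$1} END {printf "%d", s/1048576}')
        if [ "${total:-0}" -ge "$limit_mb" ]; then
            echo "$total MB	$d"
        fi
    done
}

# report_big_files BASE LIMIT_MB
# List individual files under BASE larger than LIMIT_MB.
report_big_files() {
    find "$1" -type f -size +"$2"M -exec ls -lh {} \;
}
```

Usage would be something like `report_heavy_dirs /home/aaron 500` to flag directories holding more than 500 MB of immediate files, and `report_big_files /home/aaron 500` for the lone monsters (old ISOs and the like). `du --max-depth=1 -h | sort -h` gets you most of the way too, but it counts recursively.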
If a tool like this exists, it might help me reduce some of my clutter. I know I should probably get rid of those massive ISOs from five years ago, but what if I need RH 6 next week?! I'm trying to avoid that route.
If something like this doesn't exist, I think I may have to write one. I'd love to hear thoughts on how others manage their heaps of data. Fortunately most of mine is somewhat organized, but organization comes in phases... dig through data -- organize data -- collect data -- realize data is unorganized -- repeat.
Regards, Aaron Bockover
-- TriLUG mailing list : http://www.trilug.org/mailman/listinfo/trilug TriLUG Organizational FAQ : http://trilug.org/faq/ TriLUG Member Services FAQ : http://members.trilug.org/services_faq/ TriLUG PGP Keyring : http://trilug.org/~chrish/trilug.asc
