PowerShell... and here's one of my favorite one-liners to find big files:

dir c:\temp -force -recurse | sort length -desc | format-table
creationtime,lastwritetime,lastaccesstime,length,fullname -auto

You can change the sort key by replacing length with any of the other
properties listed after format-table.
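Since the files landed over a known few days, you could also filter by
LastWriteTime before sorting. A rough sketch (the path and the date window
here are placeholders - adjust both to your environment):

```powershell
# List only files (not directories) written in a given window, largest first.
# The path and dates below are examples, not the actual server/dates.
Get-ChildItem C:\temp -Force -Recurse |
  Where-Object { -not $_.PSIsContainer -and
                 $_.LastWriteTime -ge [datetime]'2010-07-28' -and
                 $_.LastWriteTime -lt [datetime]'2010-08-02' } |
  Sort-Object Length -Descending |
  Format-Table LastWriteTime, Length, FullName -AutoSize
```

Piping to Export-Csv instead of Format-Table gives you one line per file
that you can sort and dedupe in a spreadsheet.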

On Mon, Aug 2, 2010 at 9:48 PM, Kurt Buff <[email protected]> wrote:
> All,
>
> On our file server we have a single 1.5tb partition - it's on a SAN.
> Over the course of 4 days recently it went from about 30% free to
> about 13% free - someone slammed around 200gb onto the file server.
>
> I have a general idea of where it might be - there are two top-level
> directories that are over 200gb each.
>
> However, windirstat hasn't been completely helpful, as I can't seem to
> isolate which files were loaded during those days, and none of the
> files that I've been looking at were huge - no ISO or VHD files worth
> mentioning, etc..
>
> I also am pretty confident that there are a *bunch* of duplicate files
> on those directories.
>
> So, I'm looking for a couple of things:
>
> 1) A way to get a directory listing that supports a time/date stamp
> (my choice of atime, mtime or ctime) size and a complete path name for
> each file/directory on a single line - something like:
>
>     2009-01-08  16:12   854,509
> K:\Groups\training\On-Site_Special_Training\Customer1.doc
>
> I've tried every trick I can think of for the 'dir' command and it
> won't do what I want, and the 'ls' command from GnuWin32 doesn't seem
> to want to do this either. Is there a powershell one-liner that can do
> this for me perhaps?
>
> 2) A recommendation for a duplicate file finder - cheap or free would
> be preferred.
>
> Kurt
>
> ~ Finally, powerful endpoint security that ISN'T a resource hog! ~
> ~ <http://www.sunbeltsoftware.com/Business/VIPRE-Enterprise/>  ~
>
