That's a nice one-liner, Michael! Another nice trick for my PoSh black book!

On Tue, Aug 3, 2010 at 4:22 PM, Michael B. Smith <[email protected]> wrote:
> get-childitem k:\groups -force -recurse |? {$_.CreationTime.ToString() -match
> "^2010-06-2[0-9]" } | format-table creationtime,length,fullname -auto
>
> Or select-string.
>
> No need to drop to findstr.
>
> Regards,
>
> Michael B. Smith
> Consultant and Exchange MVP
> http://TheEssentialExchange.com
>
> -----Original Message-----
> From: Kurt Buff [mailto:[email protected]]
> Sent: Tuesday, August 03, 2010 3:07 PM
> To: NT System Admin Issues
> Subject: Re: Finding a huge file dump from June...
>
> I tested this against a small directory, and am now running this:
>
> PS K:\> get-childitem k:\groups -force -recurse | format-table
> creationtime,length,fullname -auto | findstr ^2010-06-2 | findstr /v
> ^2010-06-20 | findstr /v ^2010-06-21 | findstr /v ^2010-06-22 | findstr /v
> ^2010-06-23 | findstr /v 2010-06-27 | findstr /v
> ^2010-06-28 | findstr /v ^2010-06-29 > out.txt
>
> Your hint with 'fullname' was the last piece of the puzzle.
>
> I really need to start reading my powershell books - putting them underneath
> my pillow just isn't cutting it...
>
> Need. More. Time.
>
> Kurt
>
> On Mon, Aug 2, 2010 at 20:52, Rubens Almeida <[email protected]> wrote:
>> PowerShell... and here's one of my favorite one-liners to find big files:
>>
>> dir c:\temp -force -recurse | sort length -desc | format-table
>> creationtime,lastwritetime,lastaccesstime,length,fullname -auto
>>
>> You can sort the results by replacing length with any of the other
>> properties after format-table
>>
>> On Mon, Aug 2, 2010 at 9:48 PM, Kurt Buff <[email protected]> wrote:
>>> All,
>>>
>>> On our file server we have a single 1.5tb partition - it's on a SAN.
>>> Over the course of 4 days recently it went from about 30% free to
>>> about 13% free - someone slammed around 200gb onto the file server.
>>>
>>> I have a general idea of where it might be - there are two top-level
>>> directories that are over 200gb each.
>>>
>>> However, windirstat hasn't been completely helpful, as I can't seem
>>> to isolate which files were loaded during those days, and none of the
>>> files that I've been looking at were huge - no ISO or VHD files worth
>>> mentioning, etc.
>>>
>>> I'm also pretty confident that there are a *bunch* of duplicate
>>> files in those directories.
>>>
>>> So, I'm looking for a couple of things:
>>>
>>> 1) A way to get a directory listing that shows a time/date stamp
>>> (my choice of atime, mtime or ctime), size, and a complete path name
>>> for each file/directory on a single line - something like:
>>>
>>> 2009-01-08 16:12 854,509
>>> K:\Groups\training\On-Site_Special_Training\Customer1.doc
>>>
>>> I've tried every trick I can think of for the 'dir' command and it
>>> won't do what I want, and the 'ls' command from gnuwin32 doesn't
>>> seem to want to do this either. Is there a powershell one-liner that
>>> can do this for me, perhaps?
>>>
>>> 2) A recommendation for a duplicate file finder - cheap or free would
>>> be preferred.
>>>
>>> Kurt
>>>
>>> ~ Finally, powerful endpoint security that ISN'T a resource hog! ~
>>> ~ <http://www.sunbeltsoftware.com/Business/VIPRE-Enterprise/> ~
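The thread's answers to Kurt's first question are PowerShell one-liners. For anyone reading without PowerShell at hand, the same "timestamp, size, full path, one line per file" listing can be sketched in cross-platform Python 3; the directory argument and the late-June date range below are illustrative, not from the thread. Note that `os.path.getctime` is creation time on Windows but metadata-change time on Unix, so pick the timestamp field deliberately:

```python
import os
import sys
from datetime import datetime, date

def list_files_in_range(root, start, end, which="ctime"):
    """Yield (timestamp, size, full_path) for files whose chosen
    timestamp (ctime/mtime/atime) falls within [start, end].

    On Windows, ctime is the creation time; on Unix it is the
    inode change time, so mtime may be the better choice there.
    """
    getter = {"ctime": os.path.getctime,
              "mtime": os.path.getmtime,
              "atime": os.path.getatime}[which]
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                ts = datetime.fromtimestamp(getter(path))
                size = os.path.getsize(path)
            except OSError:
                continue  # unreadable or vanished file; skip it
            if start <= ts.date() <= end:
                yield ts, size, path

if __name__ == "__main__":
    # Illustrative range: the late-June window discussed in the thread.
    for ts, size, path in list_files_in_range(sys.argv[1],
                                              date(2010, 6, 24),
                                              date(2010, 6, 26)):
        print(f"{ts:%Y-%m-%d %H:%M} {size:>15,} {path}")
```

This prints one line per file in the `2009-01-08 16:12   854,509 K:\...` style Kurt asked for, and the output can be redirected to a file just like the findstr pipeline above.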
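Kurt's second question - a cheap or free duplicate file finder - goes unanswered in the thread. A minimal sketch of the standard approach, assuming Python 3 is available (the function and variable names here are my own, not a tool from the thread): group files by size first, then hash only the files whose sizes collide, so most of the 1.5 TB volume is never read:

```python
import hashlib
import os
import sys
from collections import defaultdict

def sha256_of(path, chunk_size=1 << 20):
    """Hash a file in 1 MB chunks so large files don't exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def find_duplicates(root):
    """Return {(size, digest): [paths]} for files that appear more than once."""
    by_size = defaultdict(list)
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                by_size[os.path.getsize(path)].append(path)
            except OSError:
                continue  # unreadable or vanished file; skip it
    dupes = defaultdict(list)
    for size, paths in by_size.items():
        if len(paths) < 2:
            continue  # a unique size cannot be a duplicate
        for path in paths:
            try:
                dupes[(size, sha256_of(path))].append(path)
            except OSError:
                continue
    return {k: v for k, v in dupes.items() if len(v) > 1}

if __name__ == "__main__":
    for (size, digest), paths in sorted(find_duplicates(sys.argv[1]).items(),
                                        key=lambda kv: -kv[0][0]):
        print(f"{size:>15,}  {digest[:12]}")
        for p in paths:
            print(f"                 {p}")
```

Sorting the output by size descending, as in Rubens' one-liner, surfaces the duplicates that are actually worth deleting first.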
