Heh.

I knew that something was available, but didn't have time to research it.

Thanks.

Kurt

On Tue, Aug 3, 2010 at 12:19, Rubens Almeida <[email protected]> wrote:
> You can also replace FindStr with the native PowerShell cmdlet
> Select-String! I've even created a nice alias for it, suggestively
> called "grep" ;)
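>
> The alias itself is just a one-liner in my profile, something like this
> (the pattern and path below are only placeholders):
>
> Set-Alias grep Select-String
> # e.g. search a set of files for a pattern:
> grep "some text" c:\temp\*.log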
>
> On Tue, Aug 3, 2010 at 4:07 PM, Kurt Buff <[email protected]> wrote:
>> I tested this against a small directory, and am now running this:
>>
>> PS K:\> get-childitem k:\groups -force -recurse | format-table
>> creationtime,length,fullname -auto | findstr ^2010-06-2 | findstr /v
>> ^2010-06-20 | findstr /v ^2010-06-21 | findstr /v ^2010-06-22 |
>> findstr /v ^2010-06-23 | findstr /v ^2010-06-27 | findstr /v
>> ^2010-06-28 | findstr /v ^2010-06-29 > out.txt
>>
>> Your hint with 'fullname' was the last piece of the puzzle.
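>>
>> I suspect the date filtering could also be done with where-object on
>> CreationTime instead of that chain of findstr exclusions - something
>> like this untested sketch (range meant to match the filters above):
>>
>> get-childitem k:\groups -force -recurse |
>>     where-object { $_.creationtime -ge '2010-06-24' -and $_.creationtime -lt '2010-06-27' } |
>>     format-table creationtime,length,fullname -auto > out.txt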
>>
>> I really need to start reading my powershell books - putting them
>> underneath my pillow just isn't cutting it...
>>
>> Need. More. Time.
>>
>> Kurt
>>
>> On Mon, Aug 2, 2010 at 20:52, Rubens Almeida <[email protected]> wrote:
>>> PowerShell... and here's one of my favorite one-liners to find big files:
>>>
>>> dir c:\temp -force -recurse | sort length -desc | format-table
>>> creationtime,lastwritetime,lastaccesstime,length,fullname -auto
>>>
>>> You can sort the results by replacing length with any of the
>>> properties listed after format-table.
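>>>
>>> For example, to see the most recently modified files first instead,
>>> just swap lastwritetime into the sort (untested variation of the same
>>> command):
>>>
>>> dir c:\temp -force -recurse | sort lastwritetime -desc | format-table
>>> creationtime,lastwritetime,lastaccesstime,length,fullname -auto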
>>>
>>> On Mon, Aug 2, 2010 at 9:48 PM, Kurt Buff <[email protected]> wrote:
>>>> All,
>>>>
>>>> On our file server we have a single 1.5 TB partition - it's on a SAN.
>>>> Over the course of 4 recent days it went from about 30% free to
>>>> about 13% free - someone slammed around 200 GB onto the file server.
>>>>
>>>> I have a general idea of where it might be - there are two top-level
>>>> directories that are over 200 GB each.
>>>>
>>>> However, WinDirStat hasn't been completely helpful, as I can't seem to
>>>> isolate which files were loaded during those days, and none of the
>>>> files that I've been looking at were huge - no ISO or VHD files worth
>>>> mentioning, etc.
>>>>
>>>> I also am pretty confident that there are a *bunch* of duplicate files
>>>> on those directories.
>>>>
>>>> So, I'm looking for a couple of things:
>>>>
>>>> 1) A way to get a directory listing that shows a time/date stamp
>>>> (my choice of atime, mtime, or ctime), size, and a complete path name
>>>> for each file/directory on a single line - something like:
>>>>
>>>>     2009-01-08  16:12   854,509
>>>> K:\Groups\training\On-Site_Special_Training\Customer1.doc
>>>>
>>>> I've tried every trick I can think of for the 'dir' command and it
>>>> won't do what I want, and the 'ls' command from GnuWin32 doesn't seem
>>>> to want to do this either. Is there a PowerShell one-liner that can do
>>>> this for me, perhaps?
>>>>
>>>> 2) A recommendation for a duplicate file finder - cheap or free would
>>>> be preferred.
>>>>
>>>> Kurt
>>>>