Everyone:

Thank you so much for the help! I ended up starting with TreeSize Free, but it 
doesn’t parse network shares in the free edition. So I turned to Daniel’s 
suggestion, and the ‘Get-FolderSize’ function was able to show us which root 
share was growing the most. I was away for a while and saw the TreeSize Pro 
trial suggestion. I ran it against the root of the growing share and saw 750GB 
of data created in the last 10 days. In the end, the issue stems from a recent 
application upgrade that dumps *.dmp files to a redirected folder path. While 
we don’t have a fix for the application issue, we can likely clear out the 
offending directories from each user’s folder:

# Enumerate each user's folder under the redirected-folders share
$List = Get-ChildItem "\\Share\Folder" -Directory
Foreach($UserDir in $List)
{
    # Build the path from the folder name explicitly rather than relying on
    # how the DirectoryInfo object stringifies
    $Location = "\\Share\Folder\$($UserDir.Name)\Offending_Folder"
    If(Test-Path $Location)
    {
        # Delete only the dump files; leave everything else in place
        Remove-Item "$Location\*.dmp"
    }
}
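
We’ll probably do a dry run first; Remove-Item supports -WhatIf, so a quick 
test against a single user’s folder (the path below is just an illustrative 
placeholder) should report what would be deleted without touching anything:

# Dry run: -WhatIf lists the files that would be removed without deleting them
# "ExampleUser" is a placeholder, not a real folder name
Remove-Item "\\Share\Folder\ExampleUser\Offending_Folder\*.dmp" -WhatIf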

Thank you again, everyone!! You really helped shorten our time to resolution.

-Geoff

From: [email protected] [mailto:[email protected]] On 
Behalf Of Devin Rich
Sent: Friday, March 18, 2016 2:27 PM
To: [email protected]
Subject: Re: [powershell] RE: Parsing Large Directories:

+1 for TreeSize Pro; saving snapshots of a share lets me compare the growth 
trend over time.

Thanks,

Devin Rich
Systems Administrator
First Electronic Bank
801-576-4408

On Fri, Mar 18, 2016 at 3:07 PM, Scott Crawford 
<[email protected]> wrote:
TreeSize Pro has an awesome feature we've used for the OP's problem. After it 
analyzes a folder, you can save the analysis and compare it against a future 
state. It will show you exactly where the differences in size lie.

Sent from my Windows Phone
________________________________
From: Jonathan Raper <[email protected]>
Sent: 3/18/2016 4:05 PM
To: [email protected]
Subject: [powershell] RE: Parsing Large Directories:
+100 for TreeSize. The Pro version comes with a fully functional 30-day trial 
and offers more robust features if TreeSize Free doesn’t get you what you 
need.

Thanks,


Jonathan L Raper, A+, MCSA, MCSE, FCC Licensed Technician, VCA-DCV, VCA-Cloud
Senior Solutions Engineer
NWN Corporation
336.232.5244 Cisco Single Number Reach
7025 Albert Pick Road, Suite 302, Greensboro, NC 27409
www.NWNIT.com

NWN helps customers solve business problems through technology

From: [email protected] [mailto:[email protected]] On 
Behalf Of Michael B. Smith
Sent: Friday, March 18, 2016 3:58 PM
To: [email protected]
Subject: [powershell] RE: Parsing Large Directories:

Go old school:

TreeSize Free.

From: [email protected] [mailto:[email protected]] On 
Behalf Of Orlebeck, Geoffrey
Sent: Friday, March 18, 2016 3:36 PM
To: '[email protected]'
Subject: [powershell] Parsing Large Directories:

We have a unique situation where one of our NAS appliances has been showing 
~3GB/hr growth for the last ~7 days. We recently lost our SAN/NAS 
administrator, so the rest of us are a bit unsure how to proceed. My initial 
thought, as a cursory check, was to quickly get an idea of the root shares’ 
sizes, see which ones are growing the fastest, and then narrow focus onto 
them. I understand this may require going to the .NET level rather than using 
a simple Get-ChildItem, etc., due to the path character limit.

I’m just trying to figure out the best way to obtain (as quickly as possible) 
a basic folder size at the root of some shares and then re-run that check 
every one to three hours to determine where the growth is. We have a support 
case open with the NAS vendor, but in the meantime the current growth rate 
means we would run out of space in ~10 days. I’m just looking for some ideas 
on how to accomplish this.
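
Something along the following lines is roughly what I have in mind, as a 
rough, untested sketch only (the \\NAS\Share root and the CSV path are 
placeholders, and the ~260-character path limit may still force a robocopy or 
.NET approach for very deep trees):

# Rough sketch: record the size of each top-level folder with a timestamp so
# that repeated runs can be compared to see where the growth is
$Root  = "\\NAS\Share"                 # placeholder root share
$Stamp = Get-Date -Format "yyyy-MM-dd HH:mm"
Get-ChildItem $Root -Directory | ForEach-Object {
    # Sum file sizes under each top-level folder; long-path and access-denied
    # errors are skipped rather than stopping the run
    $Bytes = (Get-ChildItem $_.FullName -Recurse -File -ErrorAction SilentlyContinue |
              Measure-Object -Property Length -Sum).Sum
    [pscustomobject]@{
        TimeStamp = $Stamp
        Folder    = $_.Name
        SizeGB    = [math]::Round($Bytes / 1GB, 2)
    }
} | Export-Csv "C:\Temp\ShareSizes.csv" -NoTypeInformation -Append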

We have an account with access to the root share that all others come from, so 
we can crawl all of them or individual ones in separate jobs/threads. But 
again, it’s a little beyond my current knowledge level with PowerShell.

Any helpful thoughts/tips are appreciated.
