I've always assumed that this happens via one of two scenarios. I don't 
believe I've ever validated this, though, so maybe this is a good opportunity 
for feedback from the group.

1. They map a drive to a share on the server and then create a file whose 
path is perfectly valid on their system, but the server-side path is longer.

2. A folder tree gets moved somewhere else, and the paths in the resulting 
subtree exceed the limit.
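
Scenario 1 can at least be checked from the client side: resolve the mapped 
drive back to its UNC root and measure the re-based path. A rough sketch 
(the drive letter and the 260 limit are my assumptions):

```powershell
# For a mapped network drive X:, DisplayRoot holds the server-side UNC root.
$drive = Get-PSDrive -Name 'X' -PSProvider FileSystem
if( $drive.DisplayRoot )
{
        Get-ChildItem -Path 'X:\' -Recurse -File |
                ForEach-Object {
                        # re-base the client path onto the UNC root the server sees
                        $serverPath = $drive.DisplayRoot + $_.FullName.Substring( 2 )
                        if( $serverPath.Length -ge 260 )
                        {
                                "$( $serverPath.Length )`t$serverPath"
                        }
                }
}
```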

--
There are 10 kinds of people in the world...
         those who understand binary and those who don't.

-----Original Message-----
From: [email protected] [mailto:[email protected]] On 
Behalf Of Kurt Buff
Sent: Thursday, January 7, 2016 11:47 PM
To: [email protected]
Subject: Re: [powershell] Long file names - again...

I have no freaking idea how they do it. They're users, FFS, and who knows what 
goes through their pea brains. I have yet to catch them in the act.

At any rate, thanks for this. I'll give it a whirl, and see if it works for me 
- as noted, I've got somewhere north of 10m files.

BTW - I tried AlphaFS, and failed miserably with it - couldn't even get it to 
do a directory listing - and then found in the wiki for it that it doesn't 
support file hashing - it's on the feature request list.

There's another module called Zeta Long Paths, but it also says it has limited 
functionality, so I hold out no hope for it.

I've looked at .NET Core, and yes, it's not for the typical ignorant sysadmin 
like me.

As an aside, I'm using powershell and vbscript already to generate a list of 
all file names on the server, and then to output file/directory names longer 
than 250 characters, and pestering users to shorten them, and in some cases to 
flatten the directory structures. That's (as expected) a very long process.
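
That report pass is roughly the following (input/output paths here are 
placeholders, not the real ones):

```powershell
# Read the generated list of file names and report anything over 250
# characters, longest first, so the worst offenders surface at the top.
Get-Content 'C:\audit\all-files.txt' |
    Where-Object { $_.Length -gt 250 } |
    Sort-Object -Property Length -Descending |
    Set-Content 'C:\audit\long-names.txt'
```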

Kurt

On Thu, Jan 7, 2016 at 5:23 PM, Michael B. Smith <[email protected]> wrote:
> How in the heck did you create filenames longer than MAX_PATH? I've tried 
> everything I can think of, short of writing a C++ program that directly 
> connects to the NTFS FS.
>
> Anyway, this does much of what you want, with that exception. It takes a LONG 
> time for large numbers of files. That is, a million or more.
>
> [ CmdletBinding() ]
>
> Param(
>         [string] $item = 'C:\'
> )
>
> function timer
> {
>         ( Get-Date ).ToLongTimeString()
> }
>
> function wv
> {
>         Param(
>                 [string] $str
>         )
>
>         Write-Verbose "$( timer ) $str"
> }
>
> ## we only need to create this once
> $md5 = New-Object System.Security.Cryptography.MD5CryptoServiceProvider
>
> function GetMyHash
> {
>         Param(
>                 [object] $fileObject
>         )
>
>         $function = 'GetMyHash:'
>
>         wv "$function begin, fileobject name '$( $fileObject.FullName )'"
>
>         if( $fileObject.NameLength -ge 256 )
>         {
>                 ## here you should write a helper function!!!
>                 Write-Error "$function, filename too long '$( $fileObject.FullName )'"
>         }
>         else
>         {
>                 # you can compress this to a single line, but if you do that, you
>                 # can't clean up resources. if you are doing hundreds of thousands
>                 # or millions of files -- you gotta clean up
>
>                 # $md5 is declared at the script level
>
>                 try
>                 {
>                         [System.IO.FileStream] $file = [System.IO.File]::Open( $fileObject.FullName, 'Open', 'Read' )
>                         $fileObject.Hash = [BitConverter]::ToString( $md5.ComputeHash( $file ) )
>
>                         $file.Close()
>                         $file.Dispose()
>                         $file = $null
>                 }
>                 catch
>                 {
>                         ## deliberately ignore unreadable files (locked,
>                         ## access denied); Hash stays ''
>                 }
>
>                 # $md5.Clear() --- don't do this
>         }
>
>         wv "$function exit"
> }
>
> function GetFileList
> {
>         Param(
>                 [string] $item
>         )
>
>         $function = 'GetFileList:'
>
>         wv "$function begin, item '$item'"
>
>         $params = New-Object System.Collections.Arraylist
>         $params.AddRange( @(
>                 '/L',           # list only -- don't actually copy anything
>                 '/S',           # include subdirectories, but not empty ones
>                 '/NJH',         # no job header (no logo)
>                 '/BYTES',       # show filesizes in bytes
>                 '/FP',          # include full filename of file in path output
>                 '/NC',          # don't log file classes
>                 '/NDL',         # don't log directory names
>                 '/TS',          # include source filename timestamps
>                 '/XJ',          # exclude junction points
>                 '/R:0',         # number of retries on failed I/O
>                 '/W:0' ) )      # how long to wait between retries
>
>         $countPattern = '^\s{3}Files\s:\s+(?<Count>\d+).*'
>         $sizePattern  = '^\s{3}Bytes\s:\s+(?<Size>\d+(?:\.?\d+)\s[a-z]?).*'
>
>         wv "$function initiating robocopy"
>         $roboResults  = robocopy.exe $item Nothing $params
>         wv "$function robocopy complete"
>         wv "$function result from robocopy has $( $roboResults.Count ) results"
>
>         [int] $count = 0
>         foreach( $result in $roboResults )
>         {
>                 $count++
>                 if( ( $count % 1000 ) -eq 0 )
>                 {
>                         wv "$function count $count of $( $roboResults.Count )"
>                 }
>
>                 if( $result -match '(?<Size>\d+)\s(?<Date>\S+\s\S+)\s+(?<FullName>.*)' )
>                 {
>                         try
>                         {
>                                 New-Object PSObject -Property @{
>                                         FullName   = $matches.FullName
>                                         NameLength = $matches.FullName.Length
>                                         Size       = $matches.Size
>                                         Date       = [Datetime]$matches.Date
>                                         Hash       = ''
>                                 }
>                         }
>                         catch
>                         {
>                                 wv "fault: line     = '$result'"
>                                 wv "fault: fullname = '$( $matches.FullName )'"
>                                 wv "fault: namelen  = '$( $matches.FullName.Length )'"
>                                 wv "fault: size     = '$( $matches.Size.ToString() )'"
>                                 wv "fault: date     = '$( $matches.Date )'"
>                 }
>                 else
>                 {
>                         wv "$function nomatch: $result $( if( $result ) { $result.Length } else { 'null' } )"
>                 }
>         }
>
>         wv "$function exit, count = $count"
> }
>
> ##
> ## Main
> ##
>
> $global:fsiresult = GetFileList $item
>
> wv "Gonna calculate hashes"
>
> $global:fsiresult |% { GetMyHash $_ }
>
> wv "Done"
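>
> For what it's worth, a hypothetical way to drive this and turn the hashes
> into a duplicate report (script name and paths are made up):
>
> ```powershell
> .\Get-FileList.ps1 -item 'I:\' -Verbose
>
> # any hash shared by two or more files is a duplicate candidate
> $global:fsiresult |
>         Where-Object { $_.Hash } |
>         Group-Object -Property Hash |
>         Where-Object { $_.Count -gt 1 } |
>         ForEach-Object { $_.Group | Select-Object Hash, Size, FullName }
> ```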
>
> -----Original Message-----
> From: [email protected] 
> [mailto:[email protected]] On Behalf Of Kurt Buff
> Sent: Wednesday, December 30, 2015 5:20 PM
> To: [email protected]
> Subject: Re: [powershell] Long file names - again...
>
> Ultimate goal is still the same - I want to get fullname, length and a hash 
> of each file (md5 or sha1, for duplicate detection only), so AFAICT 
> PowerShell is still the way to go.
>
> Unless I can use md5sum from unxtools or gnuwin32 toolsets, or something like 
> that.
>
> Not worried about junctions at this point. I probably should look at 
> New-PsDrive for grins.
>
> I'll also take a look at .NET Core - though I doubt I can take advantage of 
> it if it hasn't been made easily available to PowerShell.
>
> Kurt
>
> On Wed, Dec 30, 2015 at 1:47 PM, Michael B. Smith <[email protected]> 
> wrote:
>> Actually, in this case, I think it's the .NET MAX_PATH (260).
>>
>> .NET itself, in its PathHelper class, doesn't accept the '\\?\' syntax and 
>> explicitly throws an exception if the path exceeds 259 (ANSI) or 129 
>> (Unicode). This class is used all over .NET.
>>
>> Now, .NET Core has been enhanced to support long paths. But that isn't 
>> today... for most people.
>>
>> There are a few libraries (third party) you can use to add this support to 
>> PowerShell. They basically wrap the low-level APIs using P/Invoke (which you 
>> can do yourself, with Add-Type). They are much lower impact than installing 
>> Cygwin, but still... I guess my question is this: what are you really trying 
>> to do?
>>
>> Cmd.exe and the subst command are still my "go to solution" for long paths. 
>> You can also fake this in PowerShell with New-PsDrive (or "net share...") if 
>> you absolutely must stay in PowerShell, but it gets a little nasty if you 
>> need to worry about junctions.
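>>
>> For example (a sketch; the deep folder is hypothetical):
>>
>> ```powershell
>> # mount a deep directory as the root of a new drive letter so that
>> # everything typed underneath it stays well under MAX_PATH
>> subst Q: "D:\projects\very\deep\folder"
>> Get-ChildItem Q:\
>> subst Q: /D     # remove the mapping when done
>> ```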
>>
>> -----Original Message-----
>> From: [email protected]
>> [mailto:[email protected]] On Behalf Of Kurt Buff
>> Sent: Wednesday, December 30, 2015 4:13 PM
>> To: [email protected]
>> Subject: [powershell] Long file names - again...
>>
>> All,
>>
>> If anyone can help, I'd much appreciate it.
>>
>> I'm picking up where I left off some time ago in auditing our file server.
>>
>> I used robocopy to generate a list of files for each drive on our file 
>> server - all told over 10.3m lines, massaged the output (with findstr) to 
>> break it up by drive letter and to remove directories and things like 
>> $recycle.bin and 'system volume', then further massaged the output to remove 
>> the extraneous robocopy markings. I had to break it into smaller files by 
>> partition because processing the file in powershell overran RAM on a 16g 
>> machine.
>>
>> I then took each line (which looked like, e.g.
>> i:\somedirectory\otherdirectory\file), then prepended '\\?\' to each 
>> line, because some number of the files have path lengths greater than
>> 260 characters, and I'm hoping that using this specification will allow 
>> access to those files without adding funky 3rd party tools.
>>
>> So, I've ended up with a set of text files that have many lines that look 
>> like this:
>>      \\?\i:\somedirectory\otherdirectory\file
>>
>> What I'm trying to do is illustrated by the following, but I'm getting no 
>> output from it - it just returns without any output after a few moments.
>>
>>      $files = get-content c:\batchfiles\file-i.txt
>>      foreach ( $file in $files )
>>      {
>>         get-childitem $file | select length, fullname
>>      }
>>
>>
>> However, if I strip the '\\?\' from each line, it does what I want - but of 
>> course the script fails as soon as it encounters a file that has a 
>> name/directory specification that exceeds the Win32 API limit.
>>
>> I've tried surrounding the string with both double and single quotes, and 
>> still no joy.
>>
>> A simpler example tells the tale:
>>
>> This works, except for long file names:
>>      gci i:\somedirectory\otherdirectory\file
>>
>> These fail silently:
>>      gci \\?\i:\somedirectory\otherdirectory\file
>>      gci "\\?\i:\somedirectory\otherdirectory\file"
>>      gci '\\?\i:\somedirectory\otherdirectory\file'
>>
>> These fail with an error:
>>      gci "\\\\?\\i:\somedirectory\otherdirectory\file"
>>      gci '\\\\?\\i:\somedirectory\otherdirectory\file'
>>
>> The error is:
>> Get-ChildItem : Cannot retrieve the dynamic parameters for the cmdlet.
>> Cannot process argument because the value of argument "path" is not valid. 
>> Change the value of the "path" argument and run the operation again.
>> At line:1 char:1
>> + gci "\\\\?\\i:\CFRemoteImages\Air Canada Montreal STOC.vhd" | 
>> + select
>> length, ful ...
>> + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
>>     + CategoryInfo          : InvalidArgument: (:) [Get-ChildItem],
>> ParameterBindingException
>>     + FullyQualifiedErrorId :
>> GetDynamicParametersException,Microsoft.PowerShell.Commands.GetChildI
>> t
>> emCommand
>>
>>
>> ================================================
>> Did you know you can also post and find answers on PowerShell in the forums?
>> http://www.myitforum.com/forums/default.asp?catApp=1