You can use a combination of tools, including PowerShell.
You can use robocopy to create the file listing (as shown by Mark Weber), turn
each line into a PSCustomObject, then calculate the file hash for each file.
If the total path length is "too long", create a share, a subst drive, or a
PSDrive to let you start deeper in the filesystem.
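For example (the drive letter and directory below are placeholders, not paths from your environment):

```powershell
# subst maps a deep directory to a new drive letter, so everything beneath
# it gets shorter by the length of the mapped prefix:
subst X: 'i:\somedirectory\otherdirectory'
Get-ChildItem X:\ -Recurse

# PowerShell-native alternative (session-scoped). Caveat, per the thread
# below: the provider may expand the path back to the full underlying
# path, so subst or a share may be the safer bet:
New-PSDrive -Name X -PSProvider FileSystem -Root 'i:\somedirectory\otherdirectory'
```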
$md5 = New-Object System.Security.Cryptography.MD5CryptoServiceProvider
$stream = [IO.File]::Open( $filePath, 'Open', 'Read' )
$hash = [BitConverter]::ToString( $md5.ComputeHash( $stream ) )
$stream.Dispose()  # don't leak the file handle across 10m+ files
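Putting those pieces together, a sketch of the whole pipeline (the input/output file names are taken from Kurt's message below; whether [IO.File]::Open tolerates a '\\?\' prefix depends on the runtime, which is the crux of this thread):

```powershell
# Read the robocopy-generated file list and emit one object per file
# with FullName, Length, and an MD5 hash for duplicate detection.
$md5 = New-Object System.Security.Cryptography.MD5CryptoServiceProvider
Get-Content 'c:\batchfiles\file-i.txt' | ForEach-Object {
    $stream = [IO.File]::Open( $_, 'Open', 'Read' )
    try {
        [PSCustomObject]@{
            FullName = $_
            Length   = $stream.Length
            Hash     = [BitConverter]::ToString( $md5.ComputeHash( $stream ) ) -replace '-', ''
        }
    } finally {
        $stream.Dispose()
    }
} | Export-Csv 'c:\batchfiles\hashes-i.csv' -NoTypeInformation
```

Streaming with ForEach-Object (rather than collecting everything into a variable) should also keep you from blowing out RAM on the 10.3m-line lists.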
Get-FileHash may work, but I _think_ that PowerShell expands the paths.
If this doesn't work for you -- you gotta go down the P/Invoke path.
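An untested sketch of what that P/Invoke route looks like with Add-Type, so the '\\?\' prefix reaches Win32 without .NET's path validation in the way (constants and error handling abbreviated):

```powershell
Add-Type -Namespace Win32 -Name Native -MemberDefinition @'
[DllImport("kernel32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
public static extern Microsoft.Win32.SafeHandles.SafeFileHandle CreateFileW(
    string lpFileName, uint dwDesiredAccess, uint dwShareMode,
    IntPtr lpSecurityAttributes, uint dwCreationDisposition,
    uint dwFlagsAndAttributes, IntPtr hTemplateFile);
'@

$md5 = New-Object System.Security.Cryptography.MD5CryptoServiceProvider
$handle = [Win32.Native]::CreateFileW(
    '\\?\i:\somedirectory\otherdirectory\file',
    0x80000000, # GENERIC_READ
    1,          # FILE_SHARE_READ
    [IntPtr]::Zero,
    3,          # OPEN_EXISTING
    0,
    [IntPtr]::Zero )

if (-not $handle.IsInvalid) {
    $stream = New-Object IO.FileStream( $handle, [IO.FileAccess]::Read )
    $hash = [BitConverter]::ToString( $md5.ComputeHash( $stream ) )
    $stream.Dispose()
}
```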
-----Original Message-----
From: [email protected] [mailto:[email protected]] On
Behalf Of Kurt Buff
Sent: Wednesday, December 30, 2015 5:20 PM
To: [email protected]
Subject: Re: [powershell] Long file names - again...
Ultimate goal is still the same - I want to get fullname, length and a hash of
each file (md5 or sha1, for duplicate detection only), so AFAICT PowerShell is
still the way to go.
Unless I can use md5sum from unxtools or gnuwin32 toolsets, or something like
that.
Not worried about junctions at this point. I probably should look at
New-PsDrive for grins.
I'll also take a look at .NET CORE - though I doubt I can take advantage of it
if it hasn't been made easily available to powershell.
Kurt
On Wed, Dec 30, 2015 at 1:47 PM, Michael B. Smith <[email protected]> wrote:
> Actually, in this case, I think it's the .NET MAX_PATH (260).
>
> .NET itself, in its PathHelper class, doesn't accept the '\\?\' syntax and
> explicitly throws an exception if the path exceeds 259 (ANSI) or 129
> (Unicode). This class is used all over .NET.
>
> Now, .NET CORE has been enhanced to support long paths. But that isn't
> today... for most people.
>
> There are a few libraries (third party) you can use to add this support to
> PowerShell. They basically wrap the low-level APIs using P/Invoke (which you
> can do yourself, with Add-Type). They are much lower impact than installing
> Cygwin, but still... I guess my question is this: what are you really trying
> to do?
>
> Cmd.exe and the subst command are still my "go to solution" for long paths.
> You can also fake this in PowerShell with New-PsDrive (or "net share...") if
> you absolutely must stay in PowerShell, but it gets a little nasty if you
> need to worry about junctions.
>
> -----Original Message-----
> From: [email protected]
> [mailto:[email protected]] On Behalf Of Kurt Buff
> Sent: Wednesday, December 30, 2015 4:13 PM
> To: [email protected]
> Subject: [powershell] Long file names - again...
>
> All,
>
> If anyone can help, I'd much appreciate it.
>
> I'm picking up where I left off some time ago in auditing our file server.
>
> I used robocopy to generate a list of files for each drive on our file server
> - all told over 10.3m lines, massaged the output (with findstr) to break it
> up by drive letter and to remove directories and things like $recycle.bin and
> 'system volume', then further massaged the output to remove the extraneous
> robocopy markings. I had to break it into smaller files by partition because
> processing the file in powershell overran RAM on a 16g machine.
>
> I then took each line (which looked like, e.g.
> i:\somedirectory\otherdirectory\file), then prepended '\\?\' to each
> line, because some number of the files have path lengths greater than
> 260 characters, and I'm hoping that using this specification will allow
> access to those files without adding funky 3rd party tools.
>
> So, I've ended up with a set of text files that have many lines that look
> like this:
> \\?\i:\somedirectory\otherdirectory\file
>
> What I'm trying to do is illustrated by the following, but I'm getting no
> output from it - it just returns without any output after a few moments.
>
> $files = get-content c:\batchfiles\file-i.txt
> foreach ( $file in $files )
> {
> get-childitem $file | select length, fullname
> }
>
>
> However, if I strip the '\\?\' from each line, it does what I want - but of
> course the script fails as soon as it encounters a file that has a
> name/directory specification that exceeds the Win32 API limit.
>
> I've tried surrounding the string with both double and single quotes, and
> still no joy.
>
> A simpler example tells the tale:
>
> This works, except for long file names:
> gci i:\somedirectory\otherdirectory\file
>
> These fail silently:
> gci \\?\i:\somedirectory\otherdirectory\file
> gci "\\?\i:\somedirectory\otherdirectory\file"
> gci '\\?\i:\somedirectory\otherdirectory\file'
>
> These fail with an error:
> gci "\\\\?\\i:\somedirectory\otherdirectory\file"
> gci '\\\\?\\i:\somedirectory\otherdirectory\file'
>
> The error is:
> Get-ChildItem : Cannot retrieve the dynamic parameters for the cmdlet.
> Cannot process argument because the value of argument "path" is not valid.
> Change the value of the "path" argument and run the operation again.
> At line:1 char:1
> + gci "\\\\?\\i:\CFRemoteImages\Air Canada Montreal STOC.vhd" | select
> length, ful ...
> + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
> + CategoryInfo : InvalidArgument: (:) [Get-ChildItem],
> ParameterBindingException
> + FullyQualifiedErrorId :
> GetDynamicParametersException,Microsoft.PowerShell.Commands.GetChildIt
> emCommand
>
>
> ================================================
> Did you know you can also post and find answers on PowerShell in the forums?
> http://www.myitforum.com/forums/default.asp?catApp=1
>
>