Good Afternoon

I am attempting to develop a script that will parse a directory listing and 
return only directory names that match a given expression.

It would make sense to me to use File::Find for this, but given the dir 
structure I am parsing, the overhead of doing so is enormous!

Basically, I have a file structure similar to:

Dir1\Dir2\Support\119404\dirx\diry
Dir1\Dir3\Support\119893\dirx
Dir1\Dir4\Support\188884\dirx\diry\dirz
.....
Dir1\Dir1000\Support\100858

I am simply interested in finding the directories directly under the Support 
dir (e.g. 119404 from the first example). There is no consistency to the naming 
convention other than Dir1 and Support; Dir2 can be many different values.

I tried something similar to the following, which did work on a much smaller 
test bed:

use File::Find;
my $dirs = "I:\\ID_000000_000999";
my @dirs;
find( sub {
    push @dirs, $File::Find::dir if $File::Find::dir =~ m{[Ss]upport/\d+$};
}, $dirs );

But on the larger-scale dir structure, the performance of this was horrible, 
since find walks the entire tree, including everything under the directories 
I am trying to match on.
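One idea I have not yet tried at scale is setting $File::Find::prune once a 
numbered directory under Support is found, so find never descends below it. 
A rough sketch (names like $start and @matches are just placeholders):

use File::Find;

my $start = "I:\\ID_000000_000999";
my @matches;
find( sub {
    return unless -d;                          # only consider directories
    if ( $File::Find::name =~ m{[Ss]upport[/\\]\d+$} ) {
        push @matches, $File::Find::name;      # e.g. ...\Support\119404
        $File::Find::prune = 1;                # do not descend any further
    }
}, $start );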

As you can see from the I:\\ in the code, this is on Windows, so ls and 
similar UNIX commands are not available.
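
Alternatively, if the Support dirs always sit at the same depth below the 
starting dir (which the examples above suggest, but I am only assuming), a 
plain glob would avoid recursion entirely; the number of /* wildcards would 
have to match the real layout:

# Sketch only: assumes Support is two levels below the start dir;
# add or remove /* to match the actual layout.
my $root = 'I:/ID_000000_000999';   # forward slashes are fine on Windows
my @matches = grep { -d && m{[/\\][Ss]upport[/\\]\d+$} }
              glob("$root/*/*/Support/*");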

Any thoughts on how I can accomplish this task with the lowest amount of 
overhead?

Thanks!
Jason
