John Goodleaf wrote:
> 
> Here's the thing. I've written a script to recurse through a
> directory tree and find all the changed files therein. I've got a db
> file to hold checksums and so forth, but that's beside the point.
> I'm using File::Find to do the recursion for me, but I can't seem to
> figure out how to keep it from recursing through directories I don't
> want. For example, I don't want stuff from dot-directories or from
> browser caches. I have been trying to use tests to ferret that out,
> but the script seems to read the files anyway.
> <snip>
> if (-s
>     && -f

You should use the _ special filehandle to avoid stat-ing the file
more than once:

      && -f _
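
For example, something like this inside wanted(), so only the first
file test calls stat() (a minimal sketch):

      # -s stats $_ and fills the stat buffer; -f _ reuses that buffer
      if (-s && -f _) {
          print "DEBUG: File found is : $_\n";
      }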


>     && $File::Find::dir !~ /[Cc]ache/
>     && $File::Find::dir !~ /\/\.*\//) {
> 
>     print "DEBUG: File found is : $_\n";
> 
>     open(FILE, $_) or print REPORT "Could not open $File::Find::name: $!\n";
>     binmode(FILE);
>     my $digest=Digest::MD5->new->addfile(*FILE)->hexdigest;
>     close FILE;
> Am I barking up the wrong tree? Any suggestion for other useful
> modules?

You want to use $File::Find::prune to bypass subdirectories:

    -d && /^\.|[Cc]ache/ && ($File::Find::prune = 1) && return;
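
For example, a complete wanted() along those lines might look like the
sketch below (rough and untested: the start directory is a placeholder,
errors just go to warn() instead of your REPORT handle, and it assumes
the default behaviour where the start directory itself is passed to
wanted() as '.'):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use File::Find;
    use Digest::MD5;

    sub wanted {
        # Prune dot-directories and cache directories so File::Find
        # never descends into them.  The $_ ne '.' guard keeps /^\./
        # from pruning the start directory, which shows up as '.'.
        if (-d && $_ ne '.' && /^\.|[Cc]ache/) {
            $File::Find::prune = 1;
            return;
        }

        # The -d above already stat-ed the file; _ reuses that buffer.
        return unless -f _ && -s _;

        print "DEBUG: File found is : $File::Find::name\n";

        open my $fh, '<', $_
            or do { warn "Could not open $File::Find::name: $!\n"; return };
        binmode $fh;
        my $digest = Digest::MD5->new->addfile($fh)->hexdigest;
        close $fh;
        print "$digest  $File::Find::name\n";
    }

    find(\&wanted, '/some/start/dir');   # placeholder path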



John
-- 
use Perl;
program
fulfillment
