Hi all

I'm writing a program in Perl to md5sum about 500,000 files. These are text
files of varying sizes, the biggest being about 500KB. My code is below:

#!/usr/bin/perl
use strict;
use warnings;
use Cwd;          # module for finding the current working directory
use Digest::MD5;

sub ScanDirectory
{
    my $workdir  = shift;
    my $startdir = cwd();   # keep track of where we began

    chdir($workdir) or die "Unable to enter dir $workdir: $!\n";
    opendir(my $dh, ".") or die "Unable to open $workdir: $!\n";
    my @names = readdir($dh);
    closedir($dh);

    foreach my $name (@names) {
        next if $name eq "." or $name eq "..";

        if (-d $name) {                 # is this a directory?
            ScanDirectory($name);
            next;
        }

        # we have already chdir'd into $workdir, so open by name
        open(my $fh, '<', $name) or die "Can't open '$name': $!";
        binmode($fh);
        print Digest::MD5->new->addfile($fh)->hexdigest,
              " ", cwd(), "/$name\n";
        close($fh);
    }

    # go back to where we started only after the whole directory is done
    chdir($startdir) or die "Unable to change to dir $startdir: $!\n";
} # Subroutine ScanDirectory

ScanDirectory(".");

The thing is that it takes about 3 to 4 hours to complete. Does that sound
about right for this number of files? Is there any way to speed things up?
If so, an example would be good, or a pointer in the right direction. If I
don't md5sum the files, the script prints them all to the screen in about
2 minutes.
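
For example, would handing the directory walk off to File::Find (instead of
my own recursive chdir) make any difference? A rough sketch of what I have
in mind, assuming the files live under /mad-server as above:

#!/usr/bin/perl
use strict;
use warnings;
use File::Find;
use Digest::MD5;

# visit every entry under /mad-server, hashing each plain file
find(sub {
    return unless -f $_;    # only hash plain files
    open(my $fh, '<', $_) or die "Can't open '$File::Find::name': $!";
    binmode($fh);
    print Digest::MD5->new->addfile($fh)->hexdigest,
          " $File::Find::name\n";
    close($fh);
}, "/mad-server");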

Thank you

Benjamin
