I have a script that recurses through a directory on a network drive and
its subdirectories, flagging files based on a couple of parameters.
First, I know this will be resource-intensive. Is there a better way to do
this, maybe not even a Perl solution?
Second, the script is failing to recurse: it descends a couple of levels
and then stops. It changes to the top-level directory fine and opens it
fine, but that is it.
I know about File::Find, but I do not want to use it.
I am running NT Workstation; the mapped network drive is on a Novell server.
Thanks for any help on this.
Here is the script:
------------
# script will recurse through the specified directory, logging files that
# have not been modified in 60 days and are over 50MB.
# use strict;
use Cwd;
use File::Spec;
# system ('h:') or die "can't get to h:\n";
$log = 'c:\\temp\\chk_space.txt';
$path = 'dbm\\marketing_database_II';
open(LOG, ">$log") or die "Can't open $log: $!";
print LOG "These files are over 60 days old and over 50MB\n\n";
&ScanDir($path);
sub ScanDir
{
    my ($workdir) = shift;
    my ($start)   = File::Spec->curdir();
    my @stuff;
    # my ($age, $size);

    system('h:');    # this fails for some reason, so I have to do it manually
    chdir($workdir)   or die "Can't cd to $workdir: $!\n";
    opendir(DIR, ".") or die "Can't open directory $workdir: $!\n";
    @stuff = grep (!/^\.\.?$/ , readdir(DIR));
    closedir(DIR);

    foreach my $file (@stuff) {
        # next if $file =~ /^\.\.?$/;
        next if $file =~ /mediaCreate/;
        if (-d $file) {
            &ScanDir($file);
            next;
        }
        if (((-M $file) > 60) && ((-s $file) > 1024 ** 2 * 50)) {
            print LOG "$workdir/$file\n";
        }
        chdir($start) or die "Unable to CD to $start: $!\n";
    }
}
close LOG;
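
Two things stand out on a close read, for anyone hitting the same wall.
File::Spec->curdir() returns the literal string ".", not an absolute path,
so the chdir($start) at the bottom of the loop is a no-op: after a
recursive call returns, the process is still sitting in the subdirectory,
and every later -d test and chdir resolves against the wrong place, which
matches "goes a couple of levels and stops". The system('h:') call can
never work either, because it changes the drive only in a short-lived
child shell, not in the Perl process itself; putting the drive letter in
the path handed to chdir does the job. Below is a minimal corrected
sketch; the h:\ prefix on $path is an assumption based on the
commented-out system ('h:') line:

use strict;
use warnings;
use Cwd;

my $log  = 'c:\\temp\\chk_space.txt';
my $path = 'h:\\dbm\\marketing_database_II';   # assumed mapping; adjust as needed

open(LOG, ">$log") or die "Can't open $log: $!";
print LOG "These files are over 60 days old and over 50MB\n\n";
ScanDir($path);
close LOG;

sub ScanDir
{
    my $workdir = shift;
    my $start   = cwd();    # absolute path, so we can actually get back here

    chdir($workdir)   or die "Can't cd to $workdir: $!\n";
    opendir(DIR, ".") or die "Can't open directory $workdir: $!\n";
    my @stuff = grep { !/^\.\.?$/ } readdir(DIR);
    closedir(DIR);

    foreach my $file (@stuff) {
        next if $file =~ /mediaCreate/;
        if (-d $file) {
            ScanDir($file);    # the callee restores the cwd before returning
            next;
        }
        # unmodified for 60+ days and larger than 50MB
        if ((-M $file) > 60 && (-s $file) > 1024 ** 2 * 50) {
            print LOG "$workdir/$file\n";
        }
    }

    chdir($start) or die "Unable to cd back to $start: $!\n";   # once, after the loop
}

And although File::Find has been ruled out here, for comparison the same
scan is only a few lines with it, since the module does all the directory
bookkeeping itself:

use strict;
use warnings;
use File::Find;

open(LOG, ">c:\\temp\\chk_space.txt") or die "Can't open log: $!";
print LOG "These files are over 60 days old and over 50MB\n\n";

find(sub {
    if (/mediaCreate/) {               # skip these names, pruning whole directories
        $File::Find::prune = 1 if -d;
        return;
    }
    return unless -f;                  # plain files only
    if ((-M _) > 60 && (-s _) > 1024 ** 2 * 50) {
        print LOG "$File::Find::name\n";
    }
}, 'h:\\dbm\\marketing_database_II');

close LOG;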