Thanks for all the help.

This is what I now have:

#!/usr/bin/perl
use strict;
use warnings;
use File::Find;

my $startdir = "/opt/log/hosts/";
my @logfile  = ("cron", "messages", "maillog", "ldap");
my @log;

# Collect every file whose name ends in one of the log names.
find(
    sub {
        foreach my $name (@logfile) {
            if (-f $_ && /$name$/) {
                push @log, $File::Find::name;
                last;
            }
        }
    },
    $startdir
);

# gzip each collected log file.
foreach my $file (@log) {
    system("gzip", $file) == 0
        or warn "gzip failed for $file\n";
    print "$file done!\n";
}

I went a different route.  I am now using File::Find to search for the contents of the 
array @logfile and just gzip those files.  It works the same, and it is easier to manage 
individual logs than directories.

I have now run into another problem.  It is gzipping all of the logs, but I only want it 
to gzip the previous day's logs.  I thought about writing something that looks at the 
date and gzips anything with the previous day's date, but I am noticing that some logs 
are written to very often, like mail logs; they end up with a date stamp of the next day, 
but with a time of 00:00.
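
One idea I have been toying with (an untested sketch): instead of matching a date 
stamp, compare each file's mtime against midnight at the start of today, with a small 
grace window so a file last written at exactly 00:00 "today" still counts as one of 
yesterday's logs:

#!/usr/bin/perl
use strict;
use warnings;
use POSIX qw(mktime);

# Midnight at the start of today, from the local clock.
my ($mday, $mon, $year) = (localtime)[3, 4, 5];
my $midnight = mktime(0, 0, 0, $mday, $mon, $year);

my @log = @ARGV;    # stand-in: the real list comes from the File::Find pass

foreach my $file (@log) {
    my $mtime = (stat $file)[9];
    next unless defined $mtime;
    # 60-second grace window: a file stamped 00:00 of today is
    # still treated as one of yesterday's logs.
    if ($mtime <= $midnight + 60) {
        system("gzip", $file) == 0
            or warn "gzip failed for $file\n";
    }
}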

Any suggestions?

Again, thanks for the earlier help.

--Keith

On Thu, 17 Jul 2003, Keith Olmstead wrote:


Hello,

I have been searching through this list for a while now, and now I need some help.  I am 
not asking for someone to do my code for me; I am trying to learn Perl, and the only 
way for me to do that is to dive straight into it.

My problem is with the overall design of the script I am trying to write.  I need 
something to back up logs for me from a central log server.  The layout of these files 
is as follows.

/host/ip/year/month/day/log1, log2, etc

Every file and dir under the top host dir is created dynamically by the logging program, syslog-ng.

What I want to do is create a script that tars and gzips the day dirs at the end of 
each day and removes each dir after it has been backed up.  After a month is finished, 
I would like the same done for the month dir.
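
Roughly what I picture for a single day dir (just a sketch; the path is made up, and 
it assumes the system tar does the compression):

#!/usr/bin/perl
use strict;
use warnings;
use File::Path;    # exports rmtree()

# Made-up example path; the real ones would be found dynamically.
my $daydir = '/opt/log/hosts/10.1.1.1/2003/07/16';

# tar + gzip the day dir, then remove it only if tar succeeded.
if (system('tar', 'czf', "$daydir.tar.gz", $daydir) == 0) {
    rmtree($daydir);
    print "$daydir archived and removed\n";
}
else {
    warn "tar failed for $daydir; leaving it in place\n";
}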

Currently there are 20ish host dirs, one for each server that is logging to this box.  
There will be more, and when they get pointed to this server, the ip dir will be 
created for that host, and each dir under it will be created for the 
corresponding date.  The logs need to be kept for 2-3 months and then deleted.
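
For the 2-3 month cleanup, I imagine checking each archive's age in days would do 
(a sketch, assuming a 90-day cutoff):

#!/usr/bin/perl
use strict;
use warnings;
use File::Find;

# Delete archives older than the (assumed) 90-day cutoff.
find(
    sub {
        return unless -f && /\.tar\.gz$/;
        unlink $_ if -M _ > 90;    # -M _ reuses -f's stat: age in days
    },
    '/opt/log/hosts/'
);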

I need help thinking this script out, and maybe some ideas on how to set it up.  From 
my reading, File::Find might be useful.

I was thinking of writing a script that runs from cron each night and tars and gzips 
the log dirs.  Currently I have a script that gets a list of the dirs, but I 
don't really know where to go from there.  I need it to descend into the tree to 
the day level and archive the previous day's logs.
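
For picking the right dir, I was toying with building yesterday's path from the date 
instead of walking the whole tree (a sketch; the host dir is made up, and it assumes 
the month/day dirs are zero-padded the way strftime's %m/%d are):

#!/usr/bin/perl
use strict;
use warnings;
use POSIX qw(strftime);

# Yesterday as year/month/day path components; going back 86400
# seconds is close enough for a nightly cron job.
my $yesterday = strftime('%Y/%m/%d', localtime(time - 86400));

# Made-up host dir; the real script would loop over every ip dir.
my $daydir = "/opt/log/hosts/10.1.1.1/$yesterday";
print "would archive $daydir\n" if -d $daydir;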

Here are 2 scripts that I have been playing with.  I don't even know if I am going in 
the right direction.

#!/usr/bin/perl
use strict;
use warnings;

my $dir = '/opt/log/hosts/';

opendir(my $dh, $dir) or die "Cannot open directory $dir: $!";

# Read the dir contents into a list, and grep out the . and .. entries.
my @entries = grep { !/^\.\.?$/ } readdir($dh);
closedir($dh);

foreach (@entries) {
    print "$_\n";
}

and

#!/usr/bin/perl
use strict;
use warnings;
use File::Find;

my $startdir = shift @ARGV
    or die "Usage: $0 startdir\n";

my @dirlist;
find(
    sub {
        return if -d and /^\.\.?$/;    # skip the '.' entry for the top dir
        push @dirlist, $File::Find::name if -d;
    },
    $startdir
);

# Join the names; assigning the array in scalar context would
# only give the element count.
my $file_list = join "\n", @dirlist;
print "$file_list\n";

Like I said before, I am not asking for someone to do my work, just some guidance in 
the right direction.

TIA,

Keith Olmstead





