
Greg Wallace wrote:

> I do indeed now have a /var/log/firewall file.  I took a look at its
> contents and it looks just like what was going into messages.  So, maybe
> that is fixed.  Now what I'd like to do is rotate out the current huge
> messages file and start with a new one.  Can you tell me how to safely do
> that?
> 
> Thanks,
> Greg Wallace
> 
> 

Something you might like to consider, now you're using syslog-ng, is getting it
to have a different log file for each day.  Here are the "destination" and
"log" entries in my syslog-ng config file:


###############################################################
destination syslog      {file("/var/log/$FACILITY.log.$YEAR$MONTH$DAY");};
destination full-syslog {file("/var/log/system.log.$YEAR$MONTH$DAY");};
###############################################################
log {source(src); destination(syslog); };
log {source(src); destination(full-syslog); };
###############################################################
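
For completeness: the "log" statements above refer to a source called src,
which isn't shown.  A minimal definition for a Linux box might look like the
following (the exact drivers depend on your platform, e.g. sun-streams() on
Solaris, so treat this as a sketch rather than my actual config):

```
source src { unix-stream("/dev/log"); internal(); };
```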


What this will give you is, for example, /var/log/system.log.20070119, which
will contain all the syslog messages, and files like
/var/log/local0.log.20070119, /var/log/mail.log.20070119 etc., which will
contain the messages for each facility.  It makes tracking things down a bit
easier.

Rather than use logrotate (most of my systems are Solaris), I use the following
shell script to gzip the log files and put them in /var/log/archive (which I
have on a separate filesystem).  It's run at 23:59 each night:


====cut here===
#!/bin/bash
cd /var/log

# Get TODAY's date first, while it is still today.  As we're running just
# before midnight, then go to sleep until after midnight so that the log
# files are no longer being written to

DAT=`date "+%Y%m%d"`
sleep 65

# As we're running from cron, output a friendly message to say what we're doing
echo -e "Compressing and archiving log files:\n"

# Get a single-column list of all the log files that were generated, then gzip
# each file before moving it to /var/log/archive
ls -1 *.log.${DAT} |
while read fil
do
   echo "$fil"
   gzip "$fil"
   mv "${fil}.gz" /var/log/archive
done

# Log files are retained for a maximum of 1 year/366 days (to cater for leap
# years).  We remove everything older than 345 days because our oldest backups
# are 21 days old and will have 345 days' worth of logs on them

echo -e "Removing archived logs older than 345 days:\n"
cd /var/log/archive
# The above cd isn't really necessary as the find command explicitly states
# where to search.
# DON'T be tempted to do  find . -name "*.log.gz"  ... you might just regret it! :)

# Find all the log files older than 345 days and remove them.  Echo the file
# name so it gets captured in the cron output

find /var/log/archive -name "*.log.*.gz" -mtime +345 |
while read arc
do
  rm "${arc}"
  echo "${arc}"
done

====cut here===
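
One tip before letting cron run something like that for the first time: dry-run
the find predicate on its own, with no rm, so you can see exactly what would be
deleted.  Here's a throwaway sketch (the demo directory and file names are
invented for illustration, it doesn't touch the real /var/log/archive):

```shell
#!/bin/bash
# Dry-run sketch: show what the 345-day cleanup would remove, without
# deleting anything, using a scratch directory instead of /var/log/archive.
DEMO=$(mktemp -d)
touch -t 202301010000 "${DEMO}/mail.log.20230101.gz"   # fabricated old archive
touch "${DEMO}/mail.log.$(date +%Y%m%d).gz"            # freshly created archive

# Same name pattern and age test as the real script, but -print only, no rm
find "${DEMO}" -name "*.log.*.gz" -mtime +345

rm -rf "${DEMO}"
```

Only the fabricated old file should be listed; the fresh one is well inside
the 345-day window.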


The crontab entry is:

59 23 * * *                                 /usr/local/scripts/clear_logs



(I keep my home-brewed stuff in /usr/local/scripts)


Hope that helps


--
Paul Walsh

