Martin Baehr wrote:
On Sun, May 15, 2005 at 04:37:14PM +1200, Steve Holdoway wrote:
Here's a backup script I wrote. I find it simple to understand, and easy to modify, given that it needs to run on a number of different linux distributions, as well as Solaris 8, 9 and 10. It is neither data manipulation nor batch processing.
agreed.
however, it is still good for stuff that is too complex for the shell.
perl at least was designed to replace shell scripts and it works well in
that area.
ok, i realize that this statement is too broad when taken out of context. the context i had in mind was what i said about shell scripts before.
It was designed, not as you suggest, but to offer an alternative
alternative and replacement are pretty much the same thing. it can't be an alternative if it is not able to replace.
It was *not* designed to generically replace shellscripts except in this specific area.
well, i don't know what other areas shell scripts are used for except data manipulation and batch processing. and as i did say that shellscripts still are better for batch stuff, we are not necessarily in disagreement here.
greetings, martin.
I wouldn't dream of writing it any other way than as a shellscript.
Steve.
#!/bin/sh
#
# Author:  Steve Holdoway
# Version: 1.0
# Date:    20th Dec 2004
# Purpose: To back up volatile files to the backup server over the
#          gigabit san.
#
# Gotchas: As this is run from cron, and I use gnu tar, there are
#          issues with finding the correct zip program due to the
#          lack of PATH setup. I could have just built up the
#          required PATH, but instead I have used absolute paths...
#          however, this has meant that the compression program needs
#          to be explicitly set, rather than just using the --gzip flag
#
#          Conversion from Slowlaris -> Linux. Native tar ok.

# Setup Programs
# We're using a copy of gnu tar, 'cos solaris tar is crap
Tar="/bin/tar --create --verbose --ignore-failed-read --gzip --files-from=-"
Scp="/usr/bin/scp -q"
Rm="/bin/rm"
Find="/usr/bin/find"
Touch="/bin/touch"
Mkdir="/bin/mkdir"
Ssh="/usr/bin/ssh"
MySQLDump="/usr/bin/mysqldump"
PGDumpAll="/usr/bin/pg_dumpall"

# Setup Environment
DateStamp=`/bin/date +%y-%m-%d`
HostName=`/bin/uname -n`
BackupHost="login.nivi.no"
BackupUser="security"
Su="/bin/su - $BackupUser -c"
Remsh="$Ssh $BackupHost"
Archive="Incremental.Archive.tar.gz"
List="Incremental.FileList.txt"
MysqlDB="mysql.sql"
PostgresqlDB="postgresql.sql"
# Setup Backup specific stuff
Dirs="/bin /boot /etc /home /initrd /lib /misc /opt /root /sbin /u01 /usr /var"
ArchiveDir="/backup"
TimeStamp=$ArchiveDir/TimeStamp
# Set the timestamp for Incrementals to work from.
$Touch $TimeStamp

# Create a destination directory
$Mkdir $ArchiveDir/$DateStamp
# Create a new backup set
$Tar --file=${ArchiveDir}/${DateStamp}/$Archive $Dirs >> ${ArchiveDir}/${DateStamp}/$List 2>/dev/null
# Save all the mysql databases
#$MySQLDump -u root --all-databases > ${ArchiveDir}/$MysqlDB

# and postgres
#$PGDumpAll -U postgres > ${ArchiveDir}/$PostgresqlDB

# and save it across to the backup server
# Ensure there's a target directory
Target=${ArchiveDir}/${HostName}/${DateStamp}
$Su "$Remsh $Mkdir -p ${Target}"
$Su "$Scp ${ArchiveDir}/${DateStamp}/* ${BackupHost}:${Target}"
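Note that the script touches $TimeStamp and defines $Find but never uses either, even though the tar command reads its file list from stdin (--files-from=-). Presumably a later revision would select only files changed since the last run and pipe them in. A minimal sketch of that incremental pipe, run against a throwaway directory so it is safe to try anywhere (all paths and names here are illustrative, not from the original script):

```shell
#!/bin/sh
# Hypothetical incremental selection: list files modified after the
# previous backup's timestamp and feed the list to tar on stdin.
Work=`mktemp -d`
mkdir $Work/data
touch $Work/TimeStamp
sleep 1
echo "changed since last backup" > $Work/data/new.txt
# Only files newer than the timestamp are selected by find;
# tar reads the resulting list via --files-from=-.
find $Work/data -newer $Work/TimeStamp -type f |
    tar --create --gzip --files-from=- --file=$Work/inc.tar.gz 2>/dev/null
# The archive should contain just the changed file.
Listing=`tar --list --file=$Work/inc.tar.gz`
echo "$Listing"
rm -rf $Work
```

Touching the timestamp *before* the tar run, as the original script does, then errs on the side of re-backing-up files modified during the backup itself, which is the safe direction for this kind of job.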
