Here's a script that I use as a base and modify according to the user's needs.
It has options for filesystems, MySQL and PostgreSQL databases, currently backing
up to an NFS-mounted partition, although there are scp options commented out as
well - mod to your heart's content...
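
The script's header mentions the cron PATH gotcha; as an alternative to
absolute paths everywhere, here's a minimal sketch that sets PATH explicitly at
the top of the job (the directory list is an assumption - adjust it for your
system):

```shell
#!/bin/sh
# Sketch: give a cron job a sane PATH up front instead of hard-coding
# every binary. The directory list here is an assumption - adjust it.
PATH=/bin:/usr/bin:/usr/local/bin
export PATH

# With PATH set, plain names resolve, so e.g. tar's --gzip flag can
# find gzip without an absolute path.
command -v gzip
```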

Steve
-- 8< --
#!/bin/sh
#
# Author:       Steve Holdoway
# Version:      1.0
# Date:         20th Dec 2004
# Purpose:      To back up volatile files to the backup server over the 
#               gigabit san.
#
# Gotchas:      As this is run from cron, and I use gnu tar, there are
#               issues with finding the correct zip program due to the
#               lack of PATH setup. I could have just built up the
#               required PATH, but instead I have used absolute paths...
#               however, this has meant that the compression program needs
#               to be explicitly set, rather than just using the --gzip flag
#
#               Conversion from Slowlaris -> Linux. Native tar ok.

# Setup Programs
# We're using a copy of gnu tar, 'cos solaris tar is crap
Tar="/bin/tar --create --verbose --ignore-failed-read --gzip"
Scp="/usr/bin/scp -P 2222 -q"
Rm="/bin/rm"
Find="/usr/bin/find"
Gzip="/bin/gzip"
MySQLDump="/usr/local/mysql/bin/mysqldump"
PGDump="/usr/local/pgsql/bin/pg_dump"
PGDumpAll="/usr/local/pgsql/bin/pg_dumpall"
Mount="/bin/mount"
Umount="/bin/umount"
MkDir="/bin/mkdir"


# Setup Environment
DateStamp=`/bin/date +%y-%m-%d`
HostName="sitehound"
BackupHost="[EMAIL PROTECTED]"
Archive="Archive.tar.gz"
List="FileList.txt"
MysqlDB="mysql.sql.gz"
PostgresqlDB="postgresql.sql.gz"

# Setup Backup specific stuff
Dirs="/bin /boot /etc /home /initrd /lib /misc /opt /root /sbin /selinux /srv /sys /usr /var /www"
ExcludeDirs="--exclude /usr/local/honeypotserver"
ArchiveDir="/backup/${HostName}"


# NFS mount the correct directory...
$Mount -t nfs <my.backup.server>:/backup /backup

# Move
if [ ! -d $ArchiveDir ]
then
        $MkDir -p $ArchiveDir
fi
cd $ArchiveDir 

# Force cleanup
$Rm -f $Archive $List $MysqlDB $PostgresqlDB

# Create a new backup set
$Tar --file=${ArchiveDir}/$Archive $Dirs $ExcludeDirs >> ${ArchiveDir}/$List 2>/dev/null

# Save all the mysql databases
$MySQLDump -u root -p<pwd> --all-databases | $Gzip > ${ArchiveDir}/$MysqlDB
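# Note: a password on the command line is visible in `ps` output. A
# safer variant (a sketch, not part of the original) is a mode-600
# defaults file, e.g. /root/.my.cnf containing
#   [client]
#   password=<pwd>
# after which the dump needs no -p flag at all:
#   $MySQLDump -u root --all-databases | $Gzip > ${ArchiveDir}/$MysqlDB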

# and postgres
#$PGDumpAll -U postgres | $Gzip > ${ArchiveDir}/$PostgresqlDB
$PGDump -U postgres db1 | $Gzip > ${ArchiveDir}/db1.sql.gz
$PGDump -U postgres db2 | $Gzip > ${ArchiveDir}/db2.sql.gz


# and save it across to the backup server
#$Scp ${ArchiveDir}/$Archive ${BackupHost}:/backup/${HostName}/${DateStamp}.${Archive}
#$Scp ${ArchiveDir}/$List ${BackupHost}:/backup/${HostName}/${DateStamp}.${List}
#$Scp ${ArchiveDir}/$MysqlDB ${BackupHost}:/backup/${HostName}/${DateStamp}.$MysqlDB
#$Scp ${ArchiveDir}/$PostgresqlDB ${BackupHost}:/backup/${HostName}/${DateStamp}.$PostgresqlDB

# Now we've finished, unmount the share.
cd /
$Umount /backup

-- 8< --
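
The script defines $Find but never uses it; one way it might be put to work is
pruning backup sets older than a week. A hedged sketch (the demo directory and
file names under /tmp are made up for illustration, and GNU touch/find are
assumed):

```shell
#!/bin/sh
# Sketch: prune backup sets older than 7 days with find -mtime.
# The demo directory and file names are assumptions for illustration.
ArchiveDir="/tmp/backup-rotate-demo"
mkdir -p "$ArchiveDir"

# Simulate one stale set and one fresh set (GNU touch -d assumed).
touch -d "10 days ago" "$ArchiveDir/old.tar.gz"
touch "$ArchiveDir/new.tar.gz"

# Remove anything last modified more than 7 days ago.
find "$ArchiveDir" -name '*.tar.gz' -mtime +7 -exec rm -f {} \;

ls "$ArchiveDir"
```

On a real run you'd point find at the dated archives on the backup share,
after the copy has succeeded.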

On Sat, 12 May 2007 20:59:49 +1200
Volker Kuhlmann <[EMAIL PROTECTED]> wrote:

> Assuming (without prejudice) a member of the $RELATIVE, $SO or
> $I_WANT_THIS_TO_WORK_BUT_DONT_CARE_HOW category, what backup solutions
> exist, obviously with GUI, for backing up filesystems onto an external
> harddisk? It doesn't have to implement a sophisticated backup strategy,
> but something basic, like making updates without doing the equivalent of
> 
> cp /home /media/disk
> 
> would be helpful. The user in aforementioned category has almost
> certainly no in-depth knowledge of which directory on the system fulfils
> what purpose.
> 
> Personally I use rsync, but I'm in a different user category so that's a
> dead end.
> 
> Did anyone look into this, and/or could make some suggestions?
> 
> Thanks,
> 
> Volker
> 
> -- 
> Volker Kuhlmann                       is list0570 with the domain in header
> http://volker.dnsalias.net/   Please do not CC list postings to me.
