We are using a perl script that connects to remote machines and writes
a dump to the local machine.  A shell script that calls this perl script
is added to the crontab and run nightly.  You can then use amanda to
back up these dumps.  I've attached both scripts.  Just edit the shell
script to your liking and create the directory you want the dumps saved
in.  Oh, and the shell script is set up to delete any dumps older than
60 days.
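
For reference, the nightly crontab entry might look something like this
(the script name, schedule, and log path are just placeholders; use
whatever you named the shell script):

```shell
# Hypothetical example: run the backup wrapper at 2:30 AM every night
# and append its output to a log file.
30 2 * * * /home/<AccountToUse>/nightly_backup.sh >> /home/<AccountToUse>/backup.log 2>&1
```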

Hope this helps.  It's simple and reliable.
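
As for the lock-then-run-amanda idea below: the catch is that FLUSH
TABLES WITH READ LOCK is only held while the client connection that
issued it stays open, so amanda would have to run from inside that
session.  An untested sketch (the credentials and the amdump config
name are placeholders, not something I actually run):

```shell
# Sketch only: hold a global read lock while amanda runs, then release.
# The mysql client's "system" command runs a shell command without
# closing the connection, so the lock stays held during the backup.
mysql -u backupuser -pPASSWORD <<'EOF'
FLUSH TABLES WITH READ LOCK;
system amdump DailySet1
UNLOCK TABLES;
EOF
```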

--
Brian


On Thu, 2002-04-04 at 11:36, John Rosendahl wrote:
> Does anyone know of a truly slick way to back up mysql databases?
> Right now I am trying to write a script which read-only locks the
> databases, then runs amanda, and then unlocks the databases.  I am
> having fun doing it, but I see little reason to re-invent the wheel
> if someone else has already done this.  I also could be going about
> this in an entirely wrong-headed manner.  All I know is I will not
> have the disk space to hold an entire mysql hotcopy on disk, so I am
> trying to find a way to have amanda back up the database files
> directly.  The ultimate solution would be to have amanda go through
> and lock the databases individually as it works, but that I fear is
> close to impossible.  Oh well.
> thanks in advance for any help
> -john

#!/usr/bin/perl
use warnings;
use Getopt::Long;

my ( $userid, $password, $host, $database, $help, $outdir );
GetOptions("u=s" => \$userid,
           "p=s" => \$password,
           "host=s" => \$host,
           "db=s" => \$database,
           "h" => \$help,
           "dir" => \$outdir, 
           "<>", \&extraArgs);
 
# Make sure we have all the inputs we need
if ( $help || !$userid || !$password || !$database )
{
   printUsage();
}

if ( !$host ) { $host = "localhost"; }

# find out where mysqldump command lives
chomp ( $dumpcmd = `which mysqldump` );
unless ( $dumpcmd ) 
{ 
   $dumpcmd = "/usr/local/mysql/bin/mysqldump";
   unless ( -e $dumpcmd ) { die "ERROR: Couldn't find mysqldump command\n"; }
}

# build the output file name
$outfile = $database . "_at_" . $host . "_on_" . time . ".sql";

# Create a directory to put the output in unless one was specified
unless ( $outdir )
{
   $home = $ENV{HOME};
   $outdir = "$home/db";
}

unless ( -e $outdir && -d $outdir )
{
   umask 0;
   mkdir $outdir, 0777;
}

$output = $outdir . "/" . $outfile;

# Log what we are doing
print "$dumpcmd -c -u $userid -p$password -h $host $database > $output\n"; 

# Do the dump; bail out if mysqldump fails
system("$dumpcmd -c -u $userid -p$password -h $host $database > $output") == 0
   or die "ERROR: mysqldump failed: $?\n";


# Compress it
$gzip = `which gzip`;
chomp $gzip;
unless ( $gzip ) 
{
   $gzip = "/bin/gzip";
   unless ( -e $gzip ) { die "ERROR: Couldn't find gzip command\n"; }
}

system("$gzip $output");

####
# printUsage
####
sub printUsage
  {
    print "
Usage: $0 -u=username -p=password -db=database [-host=hostname] [-dir=output_directory]
       -h                         # help (This text)
       -u=username                # user login for accessing database
       -p=password                # user password for accessing database
       -host=hostname|ip address  # Server the database is running on
       -db=database name          # name of db to backup
       -dir=output directory      # where to store output files
\n";
    exit 0;
  }



#--------------------------------------------------------
# Subroutine: extraArgs
#
# Purpose: To print an error message and kill the program
#          if unexpected command line arguments are found
#--------------------------------------------------------
sub extraArgs {
   my ($bad_arg) = @_;
   print "Invalid argument [$bad_arg] passed to $0\n";
   printUsage();
}
#!/bin/bash

echo "Removing files older than 60 days.."
/usr/bin/find /home/<AccountToUse>/db -type f -mtime +60 -print -exec rm -f {} \;

/home/<AccountToUse>/backupMySQL.pl -u=databaseaccount -p=password -host=hostnameorip -db=DBNAME
