Thank you!!!!

On Mon, Jul 7, 2014 at 11:57 AM, Rene Churchill <[email protected]> wrote:

> This is the script that I use. It uses mysqlhotcopy to quickly copy the
> database (reducing the amount of time the website locks up at 3am) and
> then mysqldump to create the nightly backup file. It uses bzip2 to
> compress the dump and stores the resulting file in a dated sub-directory.
> The last couple of lines of the script prune the sub-directories back so
> that only 7 days' worth are kept around.
>
> Rene
>
> #!/bin/csh -f
> #
> # A quick shell script to dump all of the database
> # data from my webserver onto the local office server
> # so that I won't lose as much information next time.
> #
> # NOTE: This script assumes you have correctly set up
> # the .my.cnf file for the user running this script
> # so that no passwords are needed to run mysql, mysqldump
> # or mysqladmin.
>
> set backup_dir = '/home/tbilling/mysql_backup';
> set mysql_data_dir = '/var/lib/mysql';
>
> # Make sure we're in the right directory.
> cd $backup_dir
>
> # First, flush any data left waiting to get to the tables.
> /usr/bin/mysql > /tmp/mysql.$$ <<EOF
> flush tables;
> EOF
>
> # Next, get the list of databases supported by this MySQL server.
> /usr/bin/mysql > /tmp/mysql.$$ <<EOF
> show databases;
> EOF
>
> # Next, create a subdirectory to hold the dumped data.
> set date = `/bin/date +%Y_%m_%d`
> if (! -d data/$date) then
>     /bin/mkdir data/$date
> endif
>
> # One by one, dump the various databases that are stored on the webserver.
> # Use mysqlhotcopy to create a copy of the database first, to reduce
> # downtime of the website. Ignore the information_schema database because
> # this is a variation of the mysql database. Ignore the test database
> # because that is just for testing and usually has nothing in it.
> foreach database ( `cat /tmp/mysql.$$` )
>     if ($database != 'Database' && $database != 'information_schema' && $database != 'test') then
>         /usr/bin/mysqladmin create hotcopy_backup
>         /usr/bin/mysqlhotcopy -q $database $mysql_data_dir/hotcopy_backup
>         /bin/mv -f $mysql_data_dir/hotcopy_backup/$database/* $mysql_data_dir/hotcopy_backup
>         /bin/rmdir $mysql_data_dir/hotcopy_backup/$database
>         /bin/echo Dumping $database ...
>         /bin/nice mysqldump --opt --quick hotcopy_backup | /usr/bin/bzip2 -q > data/$date/$database.sql.bz2
>         /usr/bin/mysqladmin --force drop hotcopy_backup
>         # /bin/sleep 30
>     endif
> end
> /bin/rm /tmp/mysql.$$
>
> # If there are more than 7 days of backup files, delete the extras.
> foreach file ( `/bin/ls -tc data | /usr/bin/tail -n +8` )
>     # /bin/echo Deleting data/$file ...
>     /bin/rm -r data/$file
> end
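A note for anyone adapting this: the .my.cnf file that the NOTE in the script refers to is the per-user MySQL options file (usually ~/.my.cnf for the user the cron job runs as). A minimal sketch of its contents is below; the account name and password are placeholders to replace with your own, and the file should be readable only by that user (e.g. chmod 600 ~/.my.cnf).

# ~/.my.cnf -- client credentials so mysql, mysqldump and mysqladmin
# can run from cron without prompting for a password.
[client]
user     = backupuser
password = replace-with-your-password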
>
> On 7/7/2014 10:54 AM, Paul Flint wrote:
>
> Dear David,
>
> To the extent that you feel comfortable, can you send out copies of the
> backup scripts?
>
> Regards,
>
> Paul
>
> On Mon, 7 Jul 2014, Tisdell, Dave wrote:
>
> Date: Mon, 7 Jul 2014 10:52:26 -0400
> From: "Tisdell, Dave" <[email protected]>
> Reply-To: Vermont Area Group of Unix Enthusiasts <[email protected]>
> To: [email protected]
> Subject: scripting mysql database backups
>
> Hi all,
>
> I have several mysql databases that I want to run separate backups of
> rather than doing an entire database dump into one file. The idea is that
> each database backup would go into its own directory at the end of the
> backup path, i.e.
>
> /dbbackups/database1/db-back1
> /dbbackups/database2/db-back2
> etc.
>
> What I have done so far is use the SQL "SHOW DATABASES" command and
> redirect the output to a dblist text file. What I would like the script
> to do is go through the file and back up every database in the list. I am
> already using sed to remove the word "Database" that the SHOW command
> generates, and to drop any databases I don't want backed up, so that the
> result is a list of just the databases I want backed up.
>
> Periodically databases get added. The idea is that the backup will
> automatically pick up the new databases from the list generated by the
> "SHOW DATABASES" command, without my editing the script.
>
> I am getting to a level of complexity in scripting that I have not done
> before. I want the same set of commands run on every db in the list.
>
> Thanks.
>
> Dave
>
> Kindest Regards,
>
> Paul Flint
> (802) 479-2360 Home
> (802) 595-9365 Cell
>
> /************************************
> Based upon email reliability concerns,
> please send an acknowledgement in response to this note.
>
> Paul Flint
> 17 Averill Street
> Barre, VT
> 05641
>
> --
> ------------------------------
> René Churchill
> VP of Development (i.e. Geek #2)
> WherezIt.com - Your source for Local information
> [email protected]
> 802-244-7880 x527
> http://www.wherezit.com/

--
David Tisdell, Music Teacher
Browns River Middle School
20 River Road
Jericho, VT 05465
[email protected] (e-mail)

--

This e-mail may contain information protected under the Family Educational
Rights and Privacy Act (FERPA). If this e-mail contains student information
and you are not entitled to access such information under FERPA, please
notify the sender. Federal regulations require that you destroy this e-mail
without reviewing it and you may not forward it to anyone.
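For anyone who wants a concrete starting point for the approach Dave describes above, a rough sketch in the same csh style as Rene's script is below. The /dbbackups path and per-database directories come from Dave's example; the sed exclusion list and the passwordless ~/.my.cnf setup are assumptions to adjust for your own server.

#!/bin/csh -f
# Sketch: dump each database into its own directory under /dbbackups,
# picking up new databases automatically from SHOW DATABASES.

set backup_root = '/dbbackups'
set date = `/bin/date +%Y_%m_%d`

# Build the database list, using sed to drop the "Database" header line
# and any databases that should never be backed up.
/usr/bin/mysql -e 'show databases' | /bin/sed -e '1d' -e '/^information_schema$/d' -e '/^performance_schema$/d' -e '/^test$/d' > /tmp/dblist.$$

# Run the same dump-and-compress commands on every database in the list.
foreach database ( `cat /tmp/dblist.$$` )
    if (! -d $backup_root/$database) then
        /bin/mkdir -p $backup_root/$database
    endif
    /bin/echo Dumping $database ...
    /bin/nice mysqldump --opt --quick $database | /usr/bin/bzip2 -q > $backup_root/$database/$database.$date.sql.bz2
end

/bin/rm /tmp/dblist.$$

Because the loop rebuilds the list from SHOW DATABASES on every run, a newly created database gets its own directory and dump file the next time the script runs, without any edits to the script.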
