Hi,
I have a 61 GB database at the moment and do a full online backup each
night.
It's not really that much of a strain, so I haven't bothered cooking up
a scheme for differential backups. Using my simple scripts it takes
one hour, and in my case I end up with 2.5 GB (compressed) of
backup files. The backup claims two CPUs for the hour that the job runs,
but on a multi-CPU box that's not much trouble.
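
If you want to automate the nightly run, a cron entry along these lines
does the job. This is just a sketch: the script path, database name and
backup directory below are made-up examples, so adjust them for your setup.

# crontab entry for the postgres user: full backup at 01:30 every night
# (paths and "mydb" are examples only; % must be escaped as \% in a crontab)
30 1 * * * /home/postgres/bin/dbbackup mydb /backups/mydb_`date +\%Y\%m\%d`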

My scripts are:
#! /bin/sh
# dbbackup <basename> <filename-prefix>: dump the database, compress the
# dump, and split it into 600 MB pieces named <filename-prefix>.aa, .ab, ...
# Relies on $HOSTNAME being set in the environment.
if test $# -lt 2; then
  echo "Usage: dbbackup <basename> <filename>"
else
  /home/postgres/postgresql/bin/pg_dump -h "$HOSTNAME" "$1" | gzip -f - | \
    split --bytes 600m - "$2".
fi

and
#! /bin/sh
# dbrestore <basename> <filename-prefix>: reassemble the split pieces,
# decompress, and feed the dump to psql.
# Relies on $HOSTNAME being set in the environment.
if test $# -lt 2; then
  echo "Usage: dbrestore <basename> <filename>"
else
  cat "$2".* | gzip -d -f - | \
    /home/postgres/postgresql/bin/psql -h "$HOSTNAME" -f - "$1"
fi
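
Typical invocations look something like this (the database name and file
prefix are just examples):

  dbbackup mydb /backups/mydb_full    # writes /backups/mydb_full.aa, .ab, ...
  dbrestore mydb /backups/mydb_full   # reassembles and restores those pieces

Since the dump is plain SQL without a CREATE DATABASE (no -C on pg_dump),
the target database has to exist before you run the restore.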

Cheers,

John

>>> "Joseph M. Day" <[EMAIL PROTECTED]> 03/23/05 8:41 PM >>>
It looks like pg_dump is the equivalent of a full backup, but how do I
get the equivalent of an incremental or differential backup?

To keep the same functionality, I will need a full backup once a week
and a differential once a day.

