Would it be feasible to do one of the following two things:

Keep the old files in a different directory and only back up certain
directories. Alternatively, gzip the old ones so that they aren't so large; I
usually find that gzip compresses dumped database files to about a tenth of
their original size.
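A minimal sketch of the gzip idea, assuming the dumps live in /sybase_dumps and that "old" means more than one day (both the path and the threshold are assumptions; adjust them to your setup):

```shell
#!/bin/sh
# Sketch only: compress dump files older than one day that aren't
# already gzipped. The directory and the one-day threshold are assumptions.
compress_old_dumps() {
    dir=$1
    find "$dir" -type f -mtime +1 ! -name '*.gz' -exec gzip {} \;
}

# Guarded so this is a no-op on machines without the dump area.
if [ -d /sybase_dumps ]; then
    compress_old_dumps /sybase_dumps
fi
```

You could run this from cron just before the purge script, so files spend most of their on-line lifetime compressed.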

or

Just use Amanda the way it normally works, and it will only back up files
that it does not already have. If you have to keep the databases that are X
days old, then every 10 days it would back all of them up, and you would not
have to worry about overwriting your old tapes with new ones. That way you
would always have a copy of every database on tape and one on the partition.
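As a rough illustration of that scheduling, the relevant amanda.conf parameters would look something like the fragment below (the numbers are illustrative assumptions, not a recommendation):

```
# Hypothetical amanda.conf fragment -- numbers are illustrative only.
dumpcycle 10 days     # every disklist entry gets at least one full dump per 10 days
runspercycle 10       # one amdump run per day of the cycle
tapecycle 15 tapes    # keep more tapes than runs so old dumps survive a full cycle
```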

Regards,

Ryan Williams



----- Original Message -----
From: "Wood, David" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Wednesday, February 14, 2001 8:46 PM
Subject: Configuration question


> Hi,
>
> I have a filesystem "/sybase_dumps".  Each night a script dumps Sybase
> databases to this filesystem and the files are quite large.  There is a
> purge script which cleans out the filesystem of files more than X days old -
> we need to keep X days worth of dump files on-line.  Basically, I only want
> Amanda to backup new dump files (as much as possible) within this
> filesystem.  Does anyone have advice on how to configure Amanda for this
> filesystem?
>
> I've thought about using the 'incronly' strategy; however, the usefulness of
> this strategy breaks down when dump levels reach 9.  I've thought about
> switching to GNUTAR and tinkering with the source to have DUMP_LEVELS =
> 2147483647 (size of int); however, I'm afraid of breaking other parts of the
> code (will I?).  I imagine the bump parameters are going to give me grief
> too.
>
> Thanks for any advice ... David
>
