One of the hosts being backed up is a photographer's computer with a
huge 7 TB store of photos. Many of the photos are multi-layer Photoshop
files, some larger than a GB each. The vast majority of this storage is
archival: only a small percentage of the photos are changing or being
added at any moment in time.
What would be the consequences of setting the BackupPC parameters so
that this host never gets another full backup? The documentation speaks
of an ever-increasing number of deltas and a tradeoff in speed.
For such a host with many very large files and a very large total
store, what would be the optimal suggested settings for:
* $Conf{FullPeriod} --- which we want large enough to ensure we NEVER
do another full backup

* $Conf{FillCycle} --- if we set this >0, would it slow down the
day-to-day backup operation?

* $Conf{FullKeepCnt} --- with these very large files, is there any
issue with setting it >1?

I must admit I'm confused by the documentation for $Conf{FullKeepCnt}:
"In the steady state, each time a full backup completes successfully the
oldest one is removed. If this number is decreased, the extra old
backups will be removed."
Does this still apply, given that in v4+ $Conf{FullKeepCnt} effectively
means "filled keep count" rather than a count of literal full backups?
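For concreteness, here is a rough sketch of the per-host overrides I have in mind (the values are illustrative assumptions on my part, not settings I know to be correct --- that is exactly what I am asking about):

```perl
# Sketch only: per-host override (e.g. in pc/<hostname>.pl) for a
# mostly-archival host. All values below are guesses, not recommendations.
$Conf{FullPeriod}  = 99999;  # days; large enough that another full
                             # backup is effectively never scheduled
$Conf{FillCycle}   = 14;     # with no fulls, fill an incremental
                             # roughly every two weeks instead
$Conf{FullKeepCnt} = 2;      # in v4 this counts filled backups,
                             # not literal rsync/tar full runs
$Conf{IncrKeepCnt} = 30;     # how many unfilled incrementals to keep
```

The question is whether a configuration along these lines behaves sanely with multi-GB files, or whether the delta chains grow without bound.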
Thanks for any advice,
Bob Katz
--
If you want good sound on your album, come to Bob Katz 407-831-0233
DIGITAL DOMAIN MASTERING STUDIO Author: "Mastering Audio" Digital Domain
Website <http://www.digido.com/> No trees were killed in the sending of
this message. However a large number of electrons were terribly
inconvenienced.
_______________________________________________
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List: https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki: http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/