Unless you have a specific reason not to, absolutely! I would use 
multi-level incrementals. I've always liked a Tower of Hanoi backup scheme 
(http://www.google.com/search?hl=en&q=%22tower+of+hanoi%22+backup+rotation), 
but there are others that work well too.
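
For what it's worth, here's a rough sketch of what a multi-level schedule 
might look like in a per-host config file. The level sequence and keep count 
below are only illustrative (a true Hanoi-style rotation would cycle through 
the levels differently), so tune them to your own retention needs:

   # Sketch only -- values are examples, not a recommendation.
   $Conf{FullPeriod}  = 6.97;                # roughly one full per week
   $Conf{IncrPeriod}  = 0.97;                # roughly one incremental per day
   $Conf{IncrLevels}  = [1, 2, 3, 4, 5, 6];  # each incremental is relative to
                                             # the most recent lower-level
                                             # backup, so later ones stay small
   $Conf{IncrKeepCnt} = 12;                  # keep enough to restore any recent day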

That said, it's still in your best interest to get all of your data into a 
full backup if at all possible. I'd recommend finding the problem and 
correcting it (it might be the timeout I mentioned, but I'm not certain).
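
If it does turn out to be the timeout, bumping $Conf{ClientTimeout} in that 
host's config file is a quick first thing to try (the value below is just 
"double the stock default of 72000"; pick whatever your slowest full needs):

   # Per-host override -- the number here is arbitrary.
   $Conf{ClientTimeout} = 144000;   # seconds of transport silence before
                                    # BackupPC gives up on the backup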

As a last resort, you could work around the problem by splitting that 
host into two (or more) hosts in the BackupPC client config, each backing 
up a portion of the physical host. If you go that route, I'd recommend 
staggering the fulls so that they don't occur on the same day. I do this on 
a couple of hosts with lots of data.
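
To make the split concrete, it would look something like this (all the names 
here are made up; $Conf{ClientNameAlias} is the knob that lets two logical 
hosts point at the same physical machine):

   # hosts file: two logical entries for one physical box
   #   host          dhcp    user
   bigbox-sys        0       backuppc
   bigbox-data       0       backuppc

   # bigbox-sys.pl -- system areas only
   $Conf{ClientNameAlias} = 'bigbox.example.edu';
   $Conf{RsyncShareName}  = ['/etc', '/home', '/var'];

   # bigbox-data.pl -- just the huge data tree
   $Conf{ClientNameAlias} = 'bigbox.example.edu';
   $Conf{RsyncShareName}  = ['/data'];

Then kick off the first full of each logical host on a different day so the 
two schedules stay staggered from there on.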

If anyone else has other advice, I'm sure they'll chime in. :-)

On Thu, 21 Jun 2007, Tony Schreiner wrote:

> It is BackupPC 3.0 but recently upgraded from 2.1.2.
>
> I have only $Conf{IncrLevels} = [1];
> Are you suggesting I implement multi-levels?
>
> Tony
>
> On Jun 21, 2007, at 10:34 AM, Stephen Joyce wrote:
>
>> If you're using 2.1.x, then yes. All incrementals are based off of the full 
>> and will check/transfer those large directories again.
>> 
>> If you're using 3.x, then no. Only the incrementals that are the same or 
>> lower level numerically than the first incremental will check/transfer 
>> those large directories.
>> 
>> If the initial full is failing due to timeout issues, you can increase the 
>> timeout.
>> 
>> From http://backuppc.sourceforge.net/faq/BackupPC.html:
>> 
>> 
>> $Conf{ClientTimeout} = 72000;
>>
>>    Timeout in seconds when listening for the transport program's 
>> (smbclient, tar etc) stdout. If no output is received during this time, 
>> then it is assumed that something has wedged during a backup, and the 
>> backup is terminated.
>>
>>    Note that stdout buffering combined with huge files being backed up 
>> could cause longish delays in the output from smbclient that BackupPC_dump 
>> sees, so in rare cases you might want to increase this value.
>>
>>    Despite the name, this parameter sets the timeout for all transport 
>> methods (tar, smb etc).
>> 
>> Upgrading to 3.x is worthwhile.
>> 
>> Cheers, Stephen
>> --
>> Stephen Joyce
>> Systems Administrator                                            P A N I C
>> Physics & Astronomy Department                         Physics & Astronomy
>> University of North Carolina at Chapel Hill         Network Infrastructure
>> voice: (919) 962-7214                                        and Computing
>> fax: (919) 962-0480                               http://www.panic.unc.edu
>> 
>> Some people make the world turn and others just watch it spin.
>>   -- Jimmy Buffett
>> 
>> On Thu, 21 Jun 2007, Tony Schreiner wrote:
>> 
>>> I'm experiencing something and I just want to check if I'm
>>> understanding what is happening.
>>> 
>>> 
>>> I back up Linux-to-Linux with XferMethod = rsync. Occasionally a full
>>> backup of one of my large systems fails for one reason or another. So
>>> I exclude some big directories and run it again, and then have the
>>> subsequent incrementals pick up the previously excluded directories.
>>> 
>>> However, and this is the part I'm not completely sure of, each
>>> subsequent incremental copies the originally missing files over and
>>> over and declares them pooled rather than same.
>>> 
>>> I think this is the way it is designed, but it is a problem in this
>>> case, because copying these large files from large directories has a
>>> strong performance impact on the client machine (which annoys my users).
>>> 
>>> Is this indeed what is happening? And is there anything I can do
>>> about it, short of running an unscheduled full backup?
>>> 
>>> Tony Schreiner
>>> 
