Jason B wrote:
> However, the transfer always times out with signal=ALRM.
>   
[...]
> Somewhat unrelated, but of all these attempts, it hasn't ever kept a
> partial - so it transfers the files, fails, and removes them. I have
> one partial from 3 weeks ago that was miraculously kept, so it keeps
> coming back to it.
>
> Would anybody have any ideas on what I can do? I've set
> $Conf{ClientTimeout} = 7200; in the config.pl... enabled
> --checksum-seed... disabled compression to rsync... no other ideas.
> Running BackupPC-3.0.0 final. I'm guessing the connection gets broken
> at some point (using rsyncd), but is there any way to make BackupPC
> attempt to reconnect and just continue from where it left off?
>   

Not exactly.  It's a gripe that has come up before.  The way BackupPC 
works is by completing a job; anything incomplete is essentially thrown 
away the next time it runs.  You might try bumping your ClientTimeout 
up further, but chances are you're actually seeing the pipe break 
because the connection is cut, or TCP errors prevent routing, or who 
knows what.  If you think about it, larger transfers are much more 
susceptible to this: there's a small chance the connection gets cut at 
any moment, so the longer it stays open, the more likely it breaks.  
Any unresumable transfer tends toward impossible to complete as the 
transfer time increases.  :-(
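
For what it's worth, the timeout goes in config.pl (or a per-host 
override); the value below is just an illustration -- pick something 
comfortably longer than your longest expected transfer:

```perl
# In config.pl, or per-host in pc/<hostname>.pl:
# seconds of client inactivity before BackupPC gives up and
# kills the transfer (that's where your signal=ALRM comes from)
$Conf{ClientTimeout} = 12 * 3600;    # 12 hours, for example
```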

> On a final note: interestingly, backups from the SAME physical host
> using a different hostname (to back up another, much smaller,
> virtualhost directory) work perfectly every day, never failed. So I'm
> guessing it's just having a problem with the size / # of files. What
> can I do?
>
>
>   

I have a machine that has a lot of video (120 GB) across a wifi WDS 
link (half 802.11g speed, at best).  I could never get an initial 
backup to succeed, because it could take 30-50 hours.  What I did was 
set up excludes on tons of directories, so the first backup was very 
short.  I kicked it off manually and waited until it completed.  Then I 
removed one excluded directory and kicked off another.  BackupPC skips 
files that have already been entered into the pool by a completed 
backup, so it is kind of like biting off smaller pieces of a single 
larger backup.  Repeat until all your files have made it into the pool.  
At that point, your full backups will be very short and only transfer 
deltas.
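
In config terms, the trick above looks something like this (the paths 
are made-up examples for your setup; BackupFilesExclude is the real 
config hash, keyed by share name):

```perl
# First backup: exclude the big directories so the job finishes fast
$Conf{BackupFilesExclude} = {
    '*' => [
        '/var/www/bigsite/media',
        '/var/www/bigsite/archives',
    ],
};
# After each successful backup, delete one entry from the list and
# kick off another manual backup.  Files already in the pool aren't
# re-transferred, so each run only pulls the newly un-excluded tree.
```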

Other people have had success moving the server physically onto the 
client's LAN and doing the first backup over a fast, stable connection, 
just to populate the pool.  That may not be an option for you.

Good luck,
JH

_______________________________________________
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/
