Hi all,

Apologies for the relatively long email, but I figure it's better to
give too much information than not enough. I've run into some
difficulty backing up a large directory tree, and I haven't managed a
successful backup in over a month now. I'm attempting to back up about
70GB over the Internet on a 1 MB/sec connection (the time it takes
doesn't really bother me; I just want to complete one full backup and
then run incrementals from then on). However, the transfer always
times out with signal=ALRM.

The tree is approximately like this:

- top level 1
- articles
  - dir 1
    - subdirs 1 through 9
  - dir 2
    - subdirs 1 through 9
  etc until dir 9 (same subdir structure)
- images
  - dir 1
    - subdirs 1 through 9
  - dir 2
    - subdirs 1 through 9
  etc until dir 9 (same subdir structure)
- top level 4

There are (on average) 5,000 files per directory (about 230,000 files
in total).
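
For anyone who wants to reproduce the shape of the tree, here's a
small shell sketch that builds a throwaway miniature of the layout
(3 dirs x 3 subdirs per section instead of 9 x 9, one file per leaf;
all paths are placeholders, not my real tree) and counts files the
same way I do on the real server:

```shell
#!/bin/sh
# Build a throwaway miniature of the layout described above
# (3 dirs x 3 subdirs per section instead of 9 x 9, one file per
# leaf -- placeholder paths only).
ROOT=$(mktemp -d)
for top in articles images; do
  for d in 1 2 3; do
    for s in 1 2 3; do
      mkdir -p "$ROOT/$top/dir$d/subdir$s"
      touch "$ROOT/$top/dir$d/subdir$s/file.txt"
    done
  done
done
# Same command I use to count files on the real tree:
find "$ROOT" -type f | wc -l    # 2 sections x 3 x 3 = 18 files here
rm -rf "$ROOT"
```

On the real tree I just point find at the document root instead of the
temporary directory.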

I tried rsyncing the directory directly; that timed out fairly early
in the process (usually with "aborted by signal=ALRM"). I tried
setting up rsyncd, and that got a bit further - I got 50GB done before
it failed the same way. Most recently, I tried creating an rsyncd
share for EACH top-level directory (so nine rsyncd shares in total),
and it got to roughly 45GB before failing again with the same message
- and it didn't keep a partial backup, so it simply removed the /new
directory and I have to restart from the beginning. Over the last
couple of weeks I've had varying degrees of success, the worst being
3GB done and the best around 50GB (as mentioned above).
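
For reference, each of the per-directory rsyncd modules looks roughly
like this (module name, path, and users are placeholders, and the
timeout value is just my guess at something matching the BackupPC
side; the real rsyncd.conf has nine such blocks):

```
# /etc/rsyncd.conf (sketch -- names and paths are placeholders)
uid = backup
gid = backup
read only = yes

[articles]
    path = /var/www/site/articles
    comment = articles tree
    auth users = backuppc
    secrets file = /etc/rsyncd.secrets
    timeout = 7200
```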

Somewhat unrelated, but across all these attempts it has never kept a
partial - it transfers the files, fails, and removes them. I do have
one partial from three weeks ago that was miraculously kept, and
BackupPC keeps falling back to that one.

Would anybody have any ideas on what I can do? I've set
$Conf{ClientTimeout} = 7200; in config.pl... enabled
--checksum-seed... disabled compression for rsync... and I'm out of
ideas. I'm running BackupPC 3.0.0 final. My guess is that the
connection gets broken at some point (using rsyncd), but is there any
way to make BackupPC attempt to reconnect and continue from where it
left off?
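
For completeness, the relevant bits of my config.pl currently look
roughly like this (excerpt only; the checksum-seed value shown is the
cache-friendly one the BackupPC docs mention, included here as an
example rather than a verbatim copy of my file):

```
# Excerpt from config.pl (sketch; only the settings mentioned above)
$Conf{XferMethod}    = 'rsyncd';
$Conf{ClientTimeout} = 7200;

# --checksum-seed enabled; 32761 is the value the BackupPC docs
# suggest so checksums can be cached
push @{$Conf{RsyncArgs}}, '--checksum-seed=32761';

# rsync-level compression left off (no -z in RsyncArgs)
```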

On a final note: interestingly, backups from the SAME physical host
under a different hostname (backing up another, much smaller,
virtualhost directory) work perfectly every day and have never failed.
So I'm guessing the problem is just the size / number of files. What
can I do?

Finally - I did send one or two emails about this to the Sourceforge
list address, but they never showed up on the mailing list or in the
Sourceforge archive. I apologise if this ends up arriving twice!

Any thoughts welcome. Thanks very much!

- Jason


_______________________________________________
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/