On Tue, 2005-12-20 at 17:00, Craig Barratt wrote:

> I have been experimenting with a perl script that generates a large
> tar file for copying the BackupPC data. 

Could you do one that rebuilds the hardlinks after the fact?  Then
you could copy one PC directory at a time, do the link step and
repeat, ending up with a reconstructed cpool.  If it kept track
of ctimes already processed, you could subsequently rsync new
backups into each PC tree and update the links, only processing
the new stuff.

-- 
  Les Mikesell
    [EMAIL PROTECTED]

_______________________________________________
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/