Hi,

On Tue, 2005-12-20 at 17:23 -0600, Les Mikesell wrote:
> On Tue, 2005-12-20 at 17:00, Craig Barratt wrote:
> 
> > I have been experimenting with a perl script that generates a large
> > tar file for copying the BackupPC data. 
> 
> Could you do one that rebuilds the hardlinks after the fact?  Then
> you could copy one PC directory at a time, do the link step and
> repeat, ending up with a reconstructed cpool.  If it kept track

A colleague of mine wrote just such a script. I needed to migrate my
BackupPC installation, with all its data, to another server and ran into
the whole hardlink / memory issue. So my colleague wrote a script that
rsyncs the /pc data machine by machine and, after each machine,
reconstructs the links into the cpool. It takes a while on large
installs, but it works like a charm; I've used it three times so far.
I'll check if I can send it to you.
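
To give a rough idea of the approach in the meantime (this is NOT my
colleague's script, just a sketch I put together to illustrate the
copy-then-relink idea): copy one host at a time without rsync -H, then
fold that host's files back into a pool directory via hardlinks, so
nothing like rsync's full inode map ever has to sit in memory. All the
paths and the md5-keyed pool layout below are placeholders; the real
BackupPC cpool uses its own partial-file hashing and directory layout.

#!/usr/bin/env python3
# Sketch only: per-machine copy, then re-link duplicates into a pool.
# Paths and pool layout are assumptions, not BackupPC's real ones.

import hashlib
import os
import subprocess

SRC = "/var/lib/backuppc/pc"            # assumed source TopDir/pc
DST = "/mnt/new/backuppc/pc"            # assumed destination pc dir
POOL = "/mnt/new/backuppc/pool_by_md5"  # simplified stand-in for the cpool


def file_md5(path):
    """Whole-file MD5 (BackupPC's real pool hash differs)."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


def copy_host(host):
    """Copy a single host's backups; no -H, so rsync's memory stays flat."""
    subprocess.run(["rsync", "-a", f"{SRC}/{host}/", f"{DST}/{host}/"],
                   check=True)


def relink_host(host):
    """Fold one host's freshly copied files into the pool via hardlinks."""
    for root, _dirs, files in os.walk(os.path.join(DST, host)):
        for name in files:
            path = os.path.join(root, name)
            if not os.path.isfile(path) or os.path.getsize(path) == 0:
                continue
            digest = file_md5(path)
            pooled = os.path.join(POOL, digest[:2], digest)
            os.makedirs(os.path.dirname(pooled), exist_ok=True)
            if os.path.exists(pooled):
                # duplicate content: replace the copy with a link to the pool
                os.unlink(path)
                os.link(pooled, path)
            else:
                # new content: this copy becomes the pool entry
                os.link(path, pooled)


def main():
    for host in sorted(os.listdir(SRC)):
        print("copying", host)
        copy_host(host)
        print("re-linking", host)
        relink_host(host)


if __name__ == "__main__":
    main()

The point is that memory use stays bounded per host: duplicates are
found by looking the digest up on disk in the pool, not by holding a
map of every inode in RAM the way rsync -H has to.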

Regards,

-- 
Guus Houtzager                           Email: [EMAIL PROTECTED]
PGP fingerprint = 5E E6 96 35 F0 64 34 14  CC 03 2B 36 71 FB 4B 5D
Early to rise, early to bed, makes a man healthy, wealthy and dead.
        --Rincewind, The Light Fantastic



