On Fri, 2005-11-11 at 04:08, Nicolai Rasmussen wrote:
> I was under the impression that cp was sufficient to make offsite backup 
> copies of BackupPC's repository (/var/lib/backuppc/).

If you have the GNU version of cp (which all Linux distributions should),
you can use 'cp -a', but it will take a very long time to duplicate all
the hardlinks in the pool.
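To illustrate, here's a toy run on temp directories (not the real pool) showing that 'cp -a' does carry hardlinks across -- the same links BackupPC uses to deduplicate files, and the reason the copy is so slow at scale:

```shell
# Demonstrate hardlink preservation with GNU cp -a on a toy tree.
src=$(mktemp -d)
dst=$(mktemp -d)
echo data > "$src/cpool_file"
ln "$src/cpool_file" "$src/pc_file"   # BackupPC-style hardlink
cp -a "$src/." "$dst/"
# Both names in the copy should point at one inode:
ls -li "$dst/cpool_file" "$dst/pc_file"
```

Every one of those links has to be tracked while the copy runs, which is what makes it crawl on a pool with millions of files.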

> I want to be able to take the latest full backup for each host (we have 10 
> hosts, max) onto a USB device and move it offsite (each week).

If you just need a full, you can either set up the 'archive host'
feature to dump to a directory on the external drive, or run the
BackupPC_tarCreate program to make a tar image.  I'd probably use the
command-line program in a script that dumps each host to a
compressed tar file.  It will construct what looks like a
full tar image, with the latest incrementals merged in automatically.
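The script idea could be sketched like this -- the host list, mount point, share name, and the BackupPC_tarCreate install path are all assumptions to adjust for your setup (some distros put the binaries under /usr/local/BackupPC/bin):

```shell
#!/bin/sh
# Sketch: dump the newest backup of each host to a compressed tar.
TARCREATE=/usr/share/backuppc/bin/BackupPC_tarCreate  # assumed path
DEST=/mnt/usb/offsite                 # assumed USB mount point
HOSTS="host1 host2 host3"             # your ~10 hosts

dump_latest() {
    for h in $HOSTS; do
        # -n -1 selects the most recent backup (the full, with any
        # later incrementals merged in); -s names the share -- '/'
        # is typical for rsync backups of a whole filesystem.
        "$TARCREATE" -h "$h" -n -1 -s / . |
            gzip > "$DEST/$h-$(date +%Y%m%d).tar.gz"
    done
}

# dump_latest   # uncomment on a machine with BackupPC installed
```

Run it as the backuppc user so BackupPC_tarCreate can read the pool.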

> My XferMethod is rsync, btw; should I then use rsync manually to make a copy 
> of the files, or how would I go about doing that?

In theory 'rsync -aH' could copy the archive area, but it will likely be
too slow to be practical due to all the hardlinks.

-- 
  Les Mikesell
    lesmikesell



_______________________________________________
BackupPC-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/
