GATOUILLAT Pierre-Damien wrote:
> Hi,
>
> Did you try the archive function of backuppc ?
> http://backuppc.sourceforge.net/faq/BackupPC.html#archive_functions
>
> I'm using this function in a script (found in the archives of the list) to
> create some tar.gz archives of my hosts, then put them on a tape for
> offsite storage, with a cron job:
> /usr/share/backuppc/bin/BackupPC_archiveHost \
>     /usr/share/backuppc/bin/BackupPC_tarCreate /usr/bin/split /usr/bin/par2 \
>     $h $LAST_FULL /bin/gzip .gz 0000000 /backup/tmp 0 "*"

Yes, I wrote a more or less similar script. The problem with it is as
follows. I use bzip2 -9 compression with BackupPC, so if I use that script
to "archive" the data for all hosts, it will:

1) uncompress the data stored by BackupPC
2) compress that same data again with bzip2 -9

Effectively, it does two unnecessary jobs, and with many hosts that takes
a lot of time.

I was wondering if there is a simpler method - just make a tar of these
directories:

/backuppc-data/pc/host1/<latest>
/backuppc-data/pc/host2/<latest>
etc.

This way we don't burn CPU cycles unnecessarily. The problem is that I'm
not sure how I could restore such a backup later on (well, I could run
BackupPC_zcat manually on each and every file, but I'd rather avoid that).

-- 
Tomasz Chmielewski
http://wpkg.org
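For what it's worth, the "tar the latest tree directly" idea could be
sketched like this. This is a minimal sketch, not a tested solution: the
top-level directory /backuppc-data and the destination /backup/tmp are
assumptions, and it relies only on the convention that backup trees under
pc/<host>/ are named by backup number, so the numerically highest
directory is the latest one. Since the files inside are already compressed
by BackupPC, tar is run without any gzip/bzip2 step:

```shell
#!/bin/sh
# Hypothetical sketch: archive the newest backup tree of each host straight
# from BackupPC's pc/ directory, skipping the uncompress/recompress cycle.

# archive_latest TOPDIR DEST
#   For every host under TOPDIR/pc, find the numerically highest backup
#   directory and tar it (with no extra compression) into DEST.
archive_latest() {
    topdir=$1
    dest=$2
    for hostdir in "$topdir"/pc/*/; do
        [ -d "$hostdir" ] || continue
        h=$(basename "$hostdir")
        # Backup trees are named by number; the highest number is the latest.
        latest=$(ls "$hostdir" | grep -E '^[0-9]+$' | sort -n | tail -1)
        [ -n "$latest" ] || continue
        # Files are already compressed by BackupPC, so plain tar is enough.
        tar -cf "$dest/$h-$latest.tar" -C "$hostdir" "$latest"
    done
}

# Example invocation (assumed default locations):
# archive_latest /backuppc-data /backup/tmp
```

Restoring from such an archive would still leave the problem described
above: each file in the untarred tree would have to be fed through
BackupPC_zcat individually, which is exactly the manual step one would
want to avoid.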
