Hi! Thank you for the tip. I changed my script to use 'BackupPC_tarCreate' in a loop over each host. The backup number is extracted dynamically from the newest "backupInfo" file whose type is "full", located under "BackupPC/pc/<hostname>/<backup number>/backupInfo". That works for now, and it will continue this weekend once the first run has completed.
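For what it's worth, a minimal sketch of such a loop is below. It is an assumption-laden illustration, not your actual script: the TOPDIR and output paths, the host list, the install path of BackupPC_tarCreate, and the grep-based parsing of the backupInfo file (a Perl dump, so grepping for the type string is a shortcut, not an official interface) all need to be adjusted to the real setup.

```shell
#!/bin/sh
# Sketch only: dump the latest full backup of each host with
# BackupPC_tarCreate. Paths, host names and the backupInfo grep are
# assumptions for a default BackupPC 3.x layout.

# Print the highest backup number under $1 (a pc/<host> directory)
# whose backupInfo records type 'full'; print nothing if there is none.
latest_full() {
    num=""
    for dir in "$1"/[0-9]*; do
        [ -f "$dir/backupInfo" ] || continue
        # backupInfo is a Perl Data::Dumper file; grepping the type
        # string is an assumption that works on stock 3.x installs.
        grep -q "'type' => 'full'" "$dir/backupInfo" || continue
        n=${dir##*/}
        if [ -z "$num" ] || [ "$n" -gt "$num" ]; then num=$n; fi
    done
    printf '%s' "$num"
}

TOPDIR=${TOPDIR:-/SATA/BackupPC}      # assumed data directory
OUTDIR=${OUTDIR:-$TOPDIR/archive}

for host in hosting1 hosting2; do     # hypothetical host list
    num=$(latest_full "$TOPDIR/pc/$host")
    [ -n "$num" ] || { echo "no full backup for $host" >&2; continue; }
    # BackupPC_tarCreate writes a tar stream to stdout; the install
    # path below is the Debian default and may differ on your system.
    /usr/share/backuppc/bin/BackupPC_tarCreate -h "$host" -n "$num" -s / . \
        | gzip > "$OUTDIR/$host.$num.tar.gz"
done
```

Running it under the backuppc user avoids permission problems on the pool, and piping through split(1) after gzip would reproduce the `.tar.gz.*` chunking your archive host produced.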
But now no backups are visible for the archive host in the web frontend, because the status information for this host never gets updated. Is there a way to update the info for the archive host, so that one can see whether the last archive run was successful for all hosts?

Kind regards,
Markus Fröhlich

On 22.09.2011 19:45, Les Mikesell wrote:
> 2011/9/22 Markus Fröhlich <[email protected]>:
>> Writing tar archive for host hosting1, backup #24, split to output files
>> /SATA/BackupPC/archive/hosting1.24.tar.gz.*
>> exiting after signal ALRM
>> Archive failed: aborted by signal=ALRM
>>
>> The strange thing is that the duration shows ~1200 min on both
>> BackupPC servers for the archive job when it fails.
>> Is there a limitation anywhere in the code?
>
> See $Conf{ClientTimeout} (under Backup Settings in the web editor).
>
> But if you are driving things with a cron job and want better control,
> why not just run BackupPC_tarCreate in your own script? I think the
> 'archive host' concept just exists to give a web link you can click.

_______________________________________________
BackupPC-users mailing list
[email protected]
List:    https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:    http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/
