Hi there,

Amazon offers, among other services, S3* (Simple Storage Service: moderately priced, with low latency) and Glacier* (extremely cheap storage whose retrievals can take hours, which makes it perfect for backups that are only needed when disaster strikes).
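For reference, the automatic S3-to-Glacier transition is configured through a bucket lifecycle rule. A minimal sketch of such a lifecycle configuration document, assuming a 14-day age threshold (the rule ID is my own choice, and the empty prefix simply matches every object in the bucket):

```json
{
  "Rules": [
    {
      "ID": "archive-to-glacier-after-14-days",
      "Status": "Enabled",
      "Filter": { "Prefix": "" },
      "Transitions": [
        { "Days": 14, "StorageClass": "GLACIER" }
      ]
    }
  ]
}
```

A document like this can be applied to a bucket with `aws s3api put-bucket-lifecycle-configuration`.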
With the right lifecycle rules in place, files uploaded to S3 can be moved to Glacier automatically, e.g. once a file's age is >= 14 days.

I'm curious whether anybody has managed to use Amazon S3 or Glacier with BackupPC, e.g. as an additional safeguard against RAID failure or file system corruption. I'm not sure that uploading each and every single file to S3 is the right way to do this, performance-wise. With BackupPC's BackupPC_archiveStart feature, one could let BackupPC generate an archive of the most recent backup and send that to S3 instead of many little files.

Did anyone try this in any way? Any suggestions on how to implement this with BackupPC?

Thanks for your feedback,
Marcel

* https://en.wikipedia.org/wiki/Amazon_S3
* https://en.wikipedia.org/wiki/Amazon_Glacier
_______________________________________________
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List: https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki: http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/
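The archive-then-upload idea mentioned above could be sketched as a small script. This is only a sketch under assumptions: BackupPC_archiveStart is invoked with an archive host, a requesting user, and one or more client hosts, and the archive host name ("archivehost"), user ("backuppc"), client host ("myclient"), ArchiveDest path, and bucket name below are all hypothetical placeholders for your own configuration.

```shell
#!/bin/sh
# Sketch: build an archive of the most recent backup of one client host
# and copy it to S3, where a lifecycle rule can later move it to Glacier.
# All names below (archivehost, backuppc, myclient, the ArchiveDest path,
# and the bucket name) are placeholders, not real configuration.

set -e

# Ask BackupPC to create an archive of the latest backup of "myclient".
# "archivehost" must be a configured archive host; "backuppc" is the
# user the request is attributed to.
BackupPC_archiveStart archivehost backuppc myclient

# The archive is written to the archive host's ArchiveDest directory,
# assumed here to be /var/lib/backuppc/archive. Pick the newest file
# for "myclient" and upload it to S3.
latest=$(ls -t /var/lib/backuppc/archive/myclient.* | head -n 1)
aws s3 cp "$latest" "s3://my-backuppc-bucket/$(basename "$latest")"
```

Run from cron after the nightly backups complete, this would keep one recent archive per host in S3, with older copies aging into Glacier via the lifecycle rule.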