On Fri, Dec 16, 2011 at 4:42 AM, Jean Spirat <jean.spi...@squirk.org> wrote:
> The issue is that we are now at 100GB of data and 8.030.000 files, so the
> backups take 48H and more (it does not help that the files are on an NFS
> share). I think I have come to the point where file backup is at its limit.
What about a script on the machine that holds all the files, using tar to bundle
these little files (all of them, or some, or groups of them) into a few bigger
archives stored in a separate directory? Run that script a few times a day,
"exclude" the directories with gazillions of files from the backup, and have
BackupPC back up only the directory holding the tar archives. A rough sketch of
such a script is appended below my signature.

Steve

--
"The universe is probably littered with the one-planet graves of cultures which
made the sensible economic decision that there's no good reason to go into
space--each discovered, studied, and remembered by the ones who made the
irrational decision."
  - Randall Munroe
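For illustration only, here is a minimal sketch (in Python) of the kind of
bundling script described above. It assumes the small files live under
/data/files and the archives are written to /data/tar-archives; both paths, and
the one-archive-per-subdirectory grouping, are placeholders you would adjust for
your own layout.

  #!/usr/bin/env python
  """Rough sketch: bundle each top-level subdirectory of SOURCE into a
  single tar.gz under ARCHIVE_DIR, so the backup only has to copy a few
  large files instead of millions of small ones.  SOURCE, ARCHIVE_DIR and
  the grouping are placeholders -- adjust them for your layout."""

  import os
  import tarfile

  SOURCE = "/data/files"              # tree containing the millions of small files
  ARCHIVE_DIR = "/data/tar-archives"  # directory BackupPC will actually back up

  if not os.path.isdir(ARCHIVE_DIR):
      os.makedirs(ARCHIVE_DIR)

  for name in sorted(os.listdir(SOURCE)):
      path = os.path.join(SOURCE, name)
      if not os.path.isdir(path):
          continue                    # only bundle directories in this sketch
      archive = os.path.join(ARCHIVE_DIR, name + ".tar.gz")
      with tarfile.open(archive, "w:gz") as tar:
          # Store paths relative to SOURCE so a restore recreates the
          # original subdirectory name.
          tar.add(path, arcname=name)

Run something like this from cron a few times a day, then exclude the
small-file tree from the backup (for example via $Conf{BackupFilesExclude} in
that host's BackupPC config) so only the archive directory gets picked up.
Whether one archive per subdirectory is the right granularity depends on how
the 8 million files are laid out.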