Hi,

Les Mikesell wrote on 2009-05-19 11:12:25 -0500 [Re: [BackupPC-users] backup the backuppc pool with bacula]:

> [...] the newest version of rsync is supposed to handle the hardlinks more
> efficiently.
Reason suggests that this is an urban myth. The newest version of rsync handles *large file lists* better, not *hardlinks*. To handle hardlinks better, you would almost certainly need to create a temporary file (which can easily be several *GB* in size in our case). I somehow doubt any general-purpose tool would dare do that (let alone find a spot where it can - my /tmp simply isn't large enough). The issue is that the temporary file will either be very large or unneeded, and an algorithm able to handle extreme numbers of hardlinks will probably be slow in the overwhelming majority of cases, where there are very few hardlinks.

> You can always try that and see how long it takes.

That you can. Just remember that your space usage (and hardlink counts) will grow over time. How does the time the copy takes grow in proportion to space and/or hardlink counts? At some random point it will stop working. You may never reach that point. But what if you do? Can you simply find another solution then? How long can you keep your pool offline for the copy process? At what point do you abort the copy process? How can you monitor its progress?

Just some things to think about ...

Regards,
Holger

_______________________________________________
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:    https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:    http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/
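P.S. For concreteness, here is a minimal sketch (in Python - this is not rsync's actual implementation, just an illustration of the technique) of the bookkeeping any hardlink-preserving copy has to do. The `seen` map from (device, inode) to first-copied path is exactly the state that grows with the number of hardlinked files; for a BackupPC pool with millions of links it is what would have to spill into the huge temporary file discussed above.

```python
import os
import shutil

def copy_tree_preserving_hardlinks(src, dst):
    """Copy the tree at src to dst, recreating hardlinks instead of
    duplicating file contents.

    The `seen` dict is the per-inode bookkeeping: one entry for every
    multiply-linked file already copied. Its size grows with the number
    of distinct hardlinked inodes, which is the scaling problem for a
    BackupPC pool.
    """
    seen = {}  # (st_dev, st_ino) -> path already created under dst
    for root, dirs, files in os.walk(src):
        rel = os.path.relpath(root, src)
        out_dir = dst if rel == "." else os.path.join(dst, rel)
        os.makedirs(out_dir, exist_ok=True)
        for name in files:
            s = os.path.join(root, name)
            d = os.path.join(out_dir, name)
            st = os.lstat(s)
            key = (st.st_dev, st.st_ino)
            if st.st_nlink > 1 and key in seen:
                # Second (or later) sight of this inode: just link it.
                os.link(seen[key], d)
            else:
                # First sight: copy the data, remember where it went.
                shutil.copy2(s, d)
                if st.st_nlink > 1:
                    seen[key] = d
```

At, say, a few hundred bytes of dict overhead per pool file, you can estimate for yourself how much memory (or temporary file space) your pool's link count translates into.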