Les Mikesell wrote at about 17:43:41 -0500 on Monday, August 8, 2011:
 > On 8/8/2011 5:33 PM, Jeffrey J. Kosowsky wrote:
 > > Les Mikesell wrote at about 16:59:45 -0500 on Monday, August 8, 2011:
 > > > On 8/8/2011 3:28 PM, Jeffrey J. Kosowsky wrote:
 > > > >
 > > > > FYI, on 32-bit Fedora 12/Linux 2.6.32:
 > > > > Ext2/3: MAX=32000
 > > > > Ext4:   MAX=65000
 > > > >
 > > > > This presumably should be true more generally for any relatively
 > > > > non-ancient & unhacked version of Linux...
 > > >
 > > > Is exceeding that a fatal error? I thought it was just supposed to add an
 > > > entry like a hash collision and go on.
 > >
 > > Well, the link command in Holger's perl script fails, and I imagine a
 > > similar bash script would fail too...
 >
 > I meant as far as BackupPC is concerned. I thought it was fairly common
 > to have a bazillion copies of the same file scattered over your clients
 > and BackupPC did something reasonable - but I'm not quite sure what.
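For reference, the per-filesystem ceiling quoted above is easy to probe directly: the kernel refuses further hard links with EMLINK once a file's link count reaches the limit. Here is a minimal sketch (the `probe_link_limit` helper, the temp directory, and the safety cap are just for illustration; with a cap high enough you would see it stop near 32000 on ext2/3 or 65000 on ext4):

```python
import os
import tempfile

def probe_link_limit(cap=100000):
    """Create hard links to one file until os.link fails with EMLINK
    (or we hit the safety cap); return how many extra links were made."""
    with tempfile.TemporaryDirectory() as d:
        target = os.path.join(d, "pool-file")
        open(target, "w").close()          # link count starts at 1
        made = 0
        try:
            for i in range(cap):
                os.link(target, os.path.join(d, "l%d" % i))
                made += 1
        except OSError:                    # EMLINK: filesystem limit reached
            pass
        return made
```

Running it with a small cap on any sane filesystem simply returns the cap; with a large cap it reveals the filesystem's actual limit.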
As I mentioned in my previous post, before creating a link, BackupPC (and my routines too, btw) checks whether HardLinkMax would be exceeded; if so, the pool element is duplicated (actually, typically moved) and the new elements are linked to the copy. This just adds an element to the md5sum chain. It all happens automatically and sensibly in BackupPC.