On Thu, Apr 14, 2005 at 12:25:40PM -0700, H. Peter Anvin wrote:
> >That may be true :-), but from the "front lines" I can report that
> >directories with > 32000 or > 65000 entries is *asking* for trouble. There
> >is a whole chain of systems that need to get things right for huge
> >directories to work well, and it often is not that way.
> >
> Specifics, please?

We've seen even Linus assume there is a 65K limit, and it appears more
people have been confused by it.

The systems I've seen mess this up include backup tools (quite serious ones
too), NetApp NFS servers, Samba shares and archivers.

Some tools just fail visibly, which is good, others become so slow as to
effectively lock up, which was the case with the backup tools. 

I've quite often been able to fix broken systems by hashing directories -
many problems just vanish. 

It is too easy to get into an O(N^2) situation. Git may be able to deal with
it, but you may hurt yourself when making backups, or if you ever want to
share your tree (possibly with yourself) over the network.

But if you live in an all Linux world, and use mostly tar and rsync, it
should work.


http://www.PowerDNS.com      Open source, database driven DNS Software 
http://netherlabs.nl              Open and Closed source services