> [snip]
> > Having a large number of files in a single directory does affect
> performance,
> > the degree of which depends on the filesystem.
> This is generally accepted wisdom for dealing with large numbers of files -
> but what number is considered "large"?
> Any rules of thumb, for different OpSys/file systems?
> [/snip]
> I guess in the situations I have had (various OSes), with upwards of 40k
> items in a directory, I have never encountered a degradation in
> performance that could be deemed noticeable. I could perform some file
> retrieval analysis to measure time lags, and I am sure there would be
> some, dependent upon the file system (and OS), not to mention hardware
> factors; the most significant that comes to mind is hard-drive seek
> time. But as yet, none noticed...

You could probably find info on this if you dug around in the
Linux/FreeBSD source code... I think more important than how many files
you can fit in one directory before things start to suck is the issue of
the directory entry itself getting big enough (kilobyte-wise) that
lookups have to scan across multiple disk blocks.
If you put them in separate directories to start out with, you can always
add a new disk (fileserver, etc.) and put half of the directories on the
new disk, mounted in the same place.  That way your code doesn't need to
change at all.
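One common way to do that split (again a sketch of my own, not from the original post; the 256-way fanout and MD5 are arbitrary choices) is to bucket each file by a hash of its name. The mapping from name to subdirectory never changes, so any bucket can later be moved to another disk and mounted back in place without touching the code:

```python
import hashlib
import os

def path_for(name, root="files"):
    # Use the first two hex digits of an MD5 of the filename to pick
    # one of 256 subdirectories. The same name always maps to the same
    # bucket, so buckets can be relocated and remounted transparently.
    bucket = hashlib.md5(name.encode()).hexdigest()[:2]
    return os.path.join(root, bucket, name)

print(path_for("photo_40123.jpg"))
```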


PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
