--- Matt <[EMAIL PROTECTED]> wrote:

> I'm working on tweaking a script I wrote a few years ago. I have a
> little under 10,000 files in different directories that the script
> generates index lists of. I'm trying to take any unnecessary load off
> the server.
>
> My first tests show a 350% speed increase by caching the dir
> listings in a MySQL database, over accessing the file system directly
> to get the lists. I'm still working on these tests and finding the
> best way to do it, but what do you think?
Yes, if the directories don't change frequently, then reading from the DB is a good way to go. How are you loading the table, and how often? If you're using a recursive PHP function to do it, you might get even better results by shelling out to a Unix/Linux find command. If you're on Windows, then the pure-PHP route is probably the best way to go.

Rough sketches of both pieces (loading the table from find, and reading the listing back) are below; the table and column names are just guesses at your schema.

James
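
For the loading side, something along these lines might work. This is only a sketch: the dir_cache table, the paths, and the credentials are all assumptions, and it uses the old mysql_* functions, so adjust to whatever you actually have.

<?php
// Rebuild the cache table from a find listing.
// Assumes a table like:
//   CREATE TABLE dir_cache (dir VARCHAR(255), filename VARCHAR(255));

$link = mysql_connect('localhost', 'user', 'pass') or die(mysql_error());
mysql_select_db('mydb', $link) or die(mysql_error());

// Let find walk the tree instead of a recursive PHP function.
$paths = array();
exec('find /path/to/files -type f', $paths);

// Start fresh, then insert one row per file.
mysql_query('TRUNCATE TABLE dir_cache', $link) or die(mysql_error());

foreach ($paths as $path) {
    $dir  = mysql_real_escape_string(dirname($path), $link);
    $file = mysql_real_escape_string(basename($path), $link);
    mysql_query("INSERT INTO dir_cache (dir, filename) VALUES ('$dir', '$file')",
                $link) or die(mysql_error());
}
?>

You could run that from cron every so often, or kick it off whenever the files actually change, so the web requests never have to touch the filesystem at all.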
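
Reading the listing back for one directory is then a single indexed query, again assuming the same made-up dir_cache table:

<?php
// Serve an index list from the cache instead of scanning the directory.
$link = mysql_connect('localhost', 'user', 'pass') or die(mysql_error());
mysql_select_db('mydb', $link) or die(mysql_error());

$dir    = mysql_real_escape_string('/path/to/files/docs', $link);
$result = mysql_query("SELECT filename FROM dir_cache WHERE dir = '$dir' ORDER BY filename",
                      $link) or die(mysql_error());

while ($row = mysql_fetch_assoc($result)) {
    echo $row['filename'] . "\n";
}
?>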