Guido van Rossum wrote:
> There is one use case I can see for an iterator version of
> os.listdir() (to be named os.opendir()): when globbing a huge
> directory looking for a certain pattern. Using os.listdir() you end
> up needing enough memory to hold all of the names at once. Using
> os.opendir() you would need only enough memory to hold all of the
> names THAT MATCH.
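For concreteness, the pattern you describe would look roughly like the
sketch below. It assumes the iterator yields one directory entry at a
time; I am using os.scandir() here purely as a stand-in for the proposed
os.opendir(), since that is what an entry-at-a-time interface looks like:

    import fnmatch
    import os

    def matching_names(path, pattern):
        """Yield only the names under *path* that match *pattern*.

        The directory is read one entry at a time, so the caller only
        pays memory for the names that actually match.
        """
        # os.scandir() stands in for the proposed os.opendir(); both
        # are assumed to produce entries lazily rather than as a list.
        with os.scandir(path) as entries:
            for entry in entries:
                if fnmatch.fnmatch(entry.name, pattern):
                    yield entry.name

    # Only the matching names are ever collected into a list.
    matches = list(matching_names(".", "*.py"))

The point being that `matches` holds only the hits, never the full
contents of the directory.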
You would still have to read the entire directory, right?

There are a number of applications in that class; e.g. du(1) also
wouldn't have to build a list of all the files in a directory, but
could add up the file sizes incrementally (sketch in the P.S.). So the
real question is whether it is a problem to keep all of the file names
in memory at once. As Aahz says, the total memory consumption for even
a large directory is still comparatively small on today's machines.

Regards,
Martin
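P.S. The du(1)-style incremental accumulation I mean, again only as a
rough sketch with os.scandir() standing in for the proposed iterator:

    import os

    def dir_size(path):
        """Sum the sizes of the regular files directly under *path*,
        accumulating incrementally instead of building a name list."""
        total = 0
        with os.scandir(path) as entries:
            for entry in entries:
                if entry.is_file(follow_symlinks=False):
                    total += entry.stat(follow_symlinks=False).st_size
        return total

    print(dir_size("."))

Nothing here depends on having all of the names at once; the only thing
kept across iterations is the running total.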