On Thu, May 01, 2008, "Martin v. Löwis" wrote:
> Guido van Rossum wrote:
>>
>> There is one use case I can see for an iterator version of
>> os.listdir() (to be named os.opendir()): when globbing a huge
>> directory looking for a certain pattern. Using os.listdir() you end
>> up needing enough memory to hold all of the names at once. Using
>> os.opendir() you would need only enough memory to hold all of the
>> names THAT MATCH.
> 
> You would still have to read the entire directory, right?  There are
> a number of applications in that class; e.g. du(1) also wouldn't
> have to build a list of all files in the directory, but could add up
> the file sizes incrementally.
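
(To make the globbing case above concrete: a minimal sketch, assuming
the proposed os.opendir() returns names lazily; both the name and the
exact API are still hypothetical at this point.)

    import fnmatch
    import os

    def iterglob(path, pattern):
        # Yield matching names one at a time; the caller only ever
        # accumulates the names that match.
        for name in os.opendir(path):   # hypothetical iterator API
            if fnmatch.fnmatch(name, pattern):
                yield name

    matches = list(iterglob('/var/huge-dir', '*.log'))

The du(1) case is the same pattern: sum os.path.getsize() over the
iterator instead of collecting names.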

Actually, the primary application I'm thinking of is a CGI that displays
part of a directory listing (paged) for manual processing of individual
files.
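
For the paged-listing case, something like this would do (again
assuming the hypothetical iterator API; itertools.islice does the
paging):

    import itertools
    import os

    def page_of_names(path, page, page_size=50):
        # Skip past earlier entries; only one page is held in memory.
        it = os.opendir(path)           # hypothetical iterator API
        start = page * page_size
        return list(itertools.islice(it, start, start + page_size))

Earlier pages still have to be read past, but nothing beyond one
page's worth of names is ever held at once.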

> So the question really is whether it is a problem to keep all file
> names in memory simultaneously. As Aahz says, the total memory
> consumption for a large directory is still comparatively low, for
> today's machines.

Only for a single process.  Throw together three or ten processes, and
it adds up.  As I said, not a huge problem, but definitely the potential
for pain.
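
To put rough, purely illustrative numbers on it: a million names at
about 50 bytes each is on the order of 50 MB as a plain list,
noticeably more with per-string object overhead, and ten such
processes multiply that again.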
-- 
Aahz ([EMAIL PROTECTED])           <*>         http://www.pythoncraft.com/

Help a hearing-impaired person: http://rule6.info/hearing.html