On 2/16/06, John Marshall <[EMAIL PROTECTED]> wrote:
> Hi,
>
> Should I expect the virtual memory allocation
> to go up if I do the following?
> -----
> raw = open("data").read()
> while True:
>     d = eval(raw)
> -----
>
> I would have expected the memory allocated to the
> object referenced by d to be deallocated, garbage
> collected, and reallocated for the new eval(raw)
> results, assigned to d.
>
> The file contains a large, SIMPLE (no self refs; all
> native python types/objects) dictionary (>300K).
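[For illustration, the pattern in the question can be reproduced self-contained. This is a sketch: the `data` file from the post is synthesized inline here as a `repr` of a dictionary, and the unbounded `while True` loop is bounded so the snippet terminates.]

```python
# Sketch of the pattern above. The original reads a large dict
# literal from a file named "data"; here the literal is built in
# memory instead (an assumption for illustration -- the real file
# is >300K of native Python types).
d = {repr(i): i for i in range(10000)}
raw = repr(d)  # stands in for open("data").read()

for _ in range(3):  # the post uses "while True"; bounded here
    parsed = eval(raw)  # re-parses and rebuilds the whole dict each pass
    assert parsed == d  # each pass yields an equal but freshly built dict

print(len(parsed))
```

Each pass discards the previous dict, so the question is whether the process ever hands that memory back to the OS.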
You're probably running into the problem that the concrete parse tree built up by the parser is rather large. While the memory used for that tree is freed to Python's malloc pool, thus making it available for other allocations by the same process, it is likely that the VM allocation for the process will permanently go up.

When I try something like this (*) I see the virtual memory size go up indefinitely with Python 2.3.5, but not with Python 2.4.1 or 2.5 (head). Even so, the problem may be fragmentation instead of a memory leak; fragmentation problems are even harder to debug than leaks (since they depend on the heuristics applied by the platform's malloc implementation).

You can file a bug for 2.3, but unless you also provide a patch it's unlikely to be fixed; the memory allocation code was revamped significantly for 2.4, so there's no simple backport of the fix available.

(*)
    d = {}
    for i in range(100000):
        d[repr(i)] = i
    s = str(d)
    while 1:
        x = eval(s)
        print 'x'

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)
_______________________________________________
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
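[Guido's test (*) can be rerun on a modern interpreter. This is a sketch in Python 3 syntax with an added crude peak-RSS readout via the standard-library `resource` module, a Unix-only assumption that stands in for watching the process size externally; the bounded loop replaces his `while 1`.]

```python
import resource  # Unix-only; an assumption beyond the original test

# Build the same ~100,000-entry dict and its str() form as in (*).
d = {repr(i): i for i in range(100000)}
s = str(d)

for _ in range(3):  # the original loops forever; bounded here
    x = eval(s)  # each pass re-parses s and builds a fresh dict
    peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    print("pass done, peak RSS:", peak)

assert x == d
```

If the peak figure keeps climbing across passes, the allocation behavior (leak or fragmentation) described above is in play; on post-2.4 interpreters it should level off after the first pass.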