Robert Elsner added the comment:
Well then at least the docs need an update. I simply fail to see how a
cache memory leak constitutes "just fine" (and the caching behavior of
struct.unpack is not documented; if somebody wants caching, he ought to
use struct.Struct.unpack, which makes the caching explicit).
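For reference, a minimal sketch of that approach (the record layout and names here are made up, not taken from the attached script): compile the format once with struct.Struct and reuse it, so the compiled format stays under the caller's control instead of in the module-level cache.

import struct

# Sketch only: a fixed record of three big-endian doubles, compiled once.
record = struct.Struct(">3d")

def read_records(f):
    # Yield one tuple per record until the file runs out.
    while True:
        chunk = f.read(record.size)
        if len(chunk) < record.size:
            break
        yield record.unpack(chunk)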
Robert Elsner added the comment:
Well, I stumbled across this leak while reading big files. And what is
the point of having a fast C-level unpack when it cannot be used with
big files?
I am not averse to the idea of caching the format string, but if the
cache grows beyond a reasonable size, it should be pruned.
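To spell out the pattern under discussion (the counts and names below are illustrative): baking the element count into the format string means every file size produces a distinct string, and each distinct string gets entered into the module-level format cache.

import struct

def unpack_doubles(data):
    # One distinct format string per possible length: ">125000d", ">125001d", ...
    fmt = ">%dd" % (len(data) // 8)
    return struct.unpack(fmt, data)

# struct._clearcache() is an undocumented CPython helper that empties that
# cache; calling it between files is a workaround, not a fix.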
Robert Elsner added the comment:
Well, the problem is that performance is severely degraded when calling unpack
multiple times. I do not know the size of the files in advance, and they might
vary from 1 MB to 1 GB. I could use some fixed-size buffer, which is
inefficient depending on the file size.
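A sketch of that fixed-chunk compromise (chunk size and function names are illustrative, not from the report): a single reused Struct keeps the format string constant regardless of file size, with a small unpack_from loop for the ragged tail.

import struct

CHUNK_DOUBLES = 4096
chunk_struct = struct.Struct(">%dd" % CHUNK_DOUBLES)   # one format string, reused
ITEM_SIZE = struct.calcsize(">d")

def iter_doubles(path):
    # Yield doubles from a big-endian binary file of arbitrary size.
    with open(path, "rb") as f:
        while True:
            data = f.read(chunk_struct.size)
            if len(data) == chunk_struct.size:
                for value in chunk_struct.unpack(data):
                    yield value
            else:
                # Ragged tail: unpack whatever whole doubles remain, one by one.
                for offset in range(0, len(data) - len(data) % ITEM_SIZE, ITEM_SIZE):
                    yield struct.unpack_from(">d", data, offset)[0]
                break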
Robert Elsner added the comment:
Well, it seems like 3.1 is in the Debian repos as well. Same memory leak, so it is
very unlikely it has been fixed in 2.7. I modified the test case to be
compatible with 3.1 and 2.6.
--
versions: +Python 3.1
Added file: http://bugs.python.org/file25239
Robert Elsner added the comment:
I would love to test, but I am in a production environment at the moment and can't really
spare the time to set up a test box. But maybe somebody with access to 2.7 on
Linux could test it with the supplied script (just start it and it should
happily eat 8 GB of memory).
New submission from Robert Elsner:
When unpacking multiple files of _variable_ length, struct.unpack leaks
massive amounts of memory. The corresponding functions from numpy (fromfile) and
the array standard library module (fromfile) behave as expected.
I prepared a minimal test case illustrating the problem.
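The attached script is not reproduced here; the sketch below is only a guess at the shape of such a test case, comparing struct.unpack with a per-size format string against array.fromfile on the same data (filename and counts are made up).

import array
import os
import struct

FILENAME = "testdata.bin"

for count in range(100000, 100200):
    with open(FILENAME, "wb") as f:
        f.write(b"\x00" * (count * 8))           # 'count' zeroed doubles

    with open(FILENAME, "rb") as f:
        struct.unpack(">%dd" % count, f.read())  # new format string every pass

    with open(FILENAME, "rb") as f:
        a = array.array("d")
        a.fromfile(f, count)                     # no format string, no cache growth

os.remove(FILENAME)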