On 6/26/2015 9:48 AM, az wrote:
Thanks Jean-Paul
You're right that I eat up a lot of memory with large files, but I think
that's not the whole story. If it were, my memory should come back each
time a new file is read (jobs=[]), no?
No. It's a feature of garbage collection: your memory is not necessarily
returned to the OS even after the objects are freed.
On Jun 27, 2015, at 6:05 AM, Dmitri Maziuk dmaz...@bmrb.wisc.edu wrote:
On 6/26/2015 9:48 AM, az wrote:
...
I apologize that I haven't had a chance to look at this in detail yet, but
I can at least give a quick answer to the question below:
Python uses a deterministic scheme for doing garbage collection based on
reference counting, so memory should be freed as soon as you do jobs=[].
That's assuming that the futures code (which I don't know) isn't doing
anything odd behind the scenes to hold onto references.
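Greg's point about CPython's deterministic reference counting can be illustrated with `weakref` probes; this is a minimal sketch (the `Job` class here is a hypothetical stand-in for whatever objects the original `jobs` list held):

```python
import weakref


class Job:
    """Hypothetical stand-in for a parsed job/molecule object."""
    pass


jobs = [Job() for _ in range(3)]
# Weak references let us observe the objects without keeping them alive.
probes = [weakref.ref(j) for j in jobs]

# All objects are alive while `jobs` holds the only strong references.
assert all(p() is not None for p in probes)

# Rebinding the name drops those references; CPython's reference
# counting reclaims the objects immediately, not at some later GC pass.
jobs = []
assert all(p() is None for p in probes)
```

Note that "the objects are freed" and "the process returns memory to the OS" are separate questions: the interpreter's allocator may hold on to freed arenas, which is consistent with the behavior az observed.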
On 6/27/2015 5:45 AM, Greg Landrum wrote:
...
That's assuming that the futures code (which I don't know) isn't doing
anything odd behind the scenes to hold onto references.
Or every mol in the supplier holds a pointer into a C++ DLL that the
Python VM doesn't quite know how to garbage-collect, which keeps the
memory from being freed.