Hi!

I was wondering if anyone could help me track down a memory leak involving
NumPy. My project is quite large and I haven't been able to construct a
simple example that reproduces the problem.

I have an iterative algorithm whose memory usage should not grow as the
iterations progress. However, after the first iteration about 1 GB of
memory is in use, and it increases steadily until, at around 100-200
iterations, 8 GB is used and the program exits with a MemoryError.

I have a collection of objects which contain large arrays. In each
iteration, the objects are updated in turn by re-computing the arrays
they contain. The number of arrays and their sizes are constant (they do
not change over the iterations), so the memory usage should stay flat.
I'm a bit confused: how can the program run out of memory when it can
easily compute the first few iterations? A stripped-down sketch of the
structure is below.
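
The names and shapes here are made up (the real code is far larger), but
the update pattern looks roughly like this: each object rebinds its array
attribute to a freshly computed array of the same shape every iteration.

import numpy as np

class Node:
    """Stand-in for one of my objects; the real ones hold several arrays."""
    def __init__(self, shape):
        self.value = np.zeros(shape)

    def update(self, others):
        # Re-compute the array; the shape never changes, so the old
        # buffer should become garbage as soon as it is rebound.
        self.value = sum(o.value for o in others) / len(others)

nodes = [Node((1000, 1000)) for _ in range(10)]
for iteration in range(200):
    for node in nodes:
        node.update([n for n in nodes if n is not node])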

I've tried Pympler, but as far as I understand it doesn't show the memory
used by NumPy arrays -- is that correct?
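
For reference, this is roughly how I tried it (run_one_iteration is just
a placeholder for my real update loop):

from pympler import tracker

def run_one_iteration():
    pass  # placeholder for the real update loop

tr = tracker.SummaryTracker()
for iteration in range(200):
    run_one_iteration()
    # Prints the change in live objects since the previous call,
    # but the reported sizes do not seem to include the array buffers.
    tr.print_diff()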

I also tried gc.set_debug(gc.DEBUG_UNCOLLECTABLE) and printed gc.garbage
after each iteration, but that doesn't show anything.
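
In code, that attempt looked roughly like this (again, run_one_iteration
stands in for the real loop body):

import gc

def run_one_iteration():
    pass  # placeholder for the real update loop

gc.set_debug(gc.DEBUG_UNCOLLECTABLE)
for iteration in range(200):
    run_one_iteration()
    gc.collect()
    # gc.garbage holds objects the collector found unreachable but
    # could not free; it stays empty in my case.
    print(iteration, len(gc.garbage))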

Does anyone have ideas on how to debug this kind of memory leak? And how
can I find out whether the bug is in my code, in NumPy, or elsewhere?

Thanks for any help!
Jaakko