On 02/03/2011 11:30 AM, Robert Abiad wrote:
On 2/3/2011 10:06 AM, Eric Firing wrote:
On 02/02/2011 10:17 PM, Eric Firing wrote:
On 02/02/2011 08:38 PM, Robert Abiad wrote:
[...]
I'll put it in as an enhancement, but I'm still unsure whether there is a bug in
there as well. Is there something I should be doing to clear memory after the
first figure is closed, other than close()? I don't understand why memory usage
grows each time I replot, but I'm pretty sure it isn't desirable behavior. As
I mentioned, this effect is worse with plot.
So is this a bug or improper usage?
I'm not quite sure, but I don't think there is a matplotlib-specific
memory leak bug at work here. Are you using ipython, and if so, have you
turned off the caching? In its default mode, ipython keeps lots of
references, thereby keeping memory in use. Also, memory management and
reporting can be a bit tricky and misleading.
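[A minimal sketch, not from the original message, of how to limit IPython's
reference-keeping; this assumes a reasonably recent IPython, and the exact
option spelling differs between versions:

    # Start IPython with the output cache disabled, e.g.:
    #     ipython --cache-size=0
    # Or, inside a running session, drop cached references by hand:
    %reset -f out     # clear the Out[...] output cache
    %xdel z           # delete a variable and purge it from IPython's caches
    import gc
    gc.collect()      # then let Python reclaim whatever is now unreferenced
]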
Nevertheless, the attached script may illustrate the problem. Try
running it from the command line as-is (maybe shorten the loop--it
doesn't take 100 iterations to show the pattern) and then commenting out
the line as indicated in the comment. It seems that if anything is done
that adds ever so slightly to memory use while the figure is displayed,
then when the figure is closed, its memory is not reused. I'm puzzled.
I wasn't thinking straight--there is no mystery and no memory leak.
Ignore the example script I referred to above. It was saving rows of the z
array, not single elements as I had intended, so of course memory use
was growing substantially.
Eric
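[The discarded script is not shown in the thread, but a minimal sketch of the
mistake described above, with hypothetical names, would be:

    import numpy as np

    saved = []
    z = np.random.rand(2000, 2000)
    # What the script did: z[0] is a view into z, so appending it keeps the
    # whole 2000x2000 array alive for every iteration's z.
    saved.append(z[0])
    # What was intended: a single scalar, which holds no reference to z.
    saved.append(float(z[0, 0]))
]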
You may not see a memory leak, but I still can't get my memory back without
killing python. I turned off the IPython caching and even ran without IPython
on both Windows and Ubuntu, but when I use imshow(), followed by close('all')
and another imshow(), I run out of memory. I can see from the OS that the
memory does not come back after close() and that it grows after the second
imshow(). Any other ideas? Looks like a bug to me otherwise.
Except that I tried the same things and did not get quite the same
result. Let's track this down. Please try the attached script, and see
if the memory usage grows substantially, or just oscillates a bit.
Eric
import time
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.cbook import report_memory

plt.ion()
mem = []
for i in range(20):
    # Make a fresh image each iteration; the figure is closed below.
    z = np.random.rand(2000, 2000)
    plt.imshow(z)
    #plt.draw()
    #time.sleep(0.5)
    plt.savefig("test.png")
    plt.close('all')
    # report_memory() returns the process's current memory usage.
    m = report_memory()
    mem.append(m)
    print(m)
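[Not part of the original attachment, but a sketch of one further check: force
a garbage collection after close('all') and see whether the reported memory
still grows. This assumes the same report_memory helper as above and uses the
non-interactive Agg backend so it can run unattended:

    import gc

    import numpy as np
    import matplotlib
    matplotlib.use("Agg")   # assumption: run headless; any backend should do
    import matplotlib.pyplot as plt
    from matplotlib.cbook import report_memory

    for i in range(20):
        z = np.random.rand(2000, 2000)
        plt.imshow(z)
        plt.savefig("test.png")
        plt.close('all')
        gc.collect()        # collect any reference cycles left by the figure
        print(report_memory())

If the numbers keep climbing even after gc.collect(), something is still
holding references to the closed figures; if they level off after the first
iteration or two, the growth was uncollected garbage rather than a leak.]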