Correct.

On Tue, May 28, 2013 at 9:42 AM, zetah wrote:
> Albert Kottke wrote:
> > I had this problem as well. I think my solution was to tell the
> > garbage collector to collect.
I had this problem as well. I think my solution was to tell the garbage
collector to collect.

import gc
import numpy as np
import matplotlib.pyplot as plt

def draw_fig(arr, fn):
    fig = plt.figure()
    ax = fig.add_subplot(111)
    ax.contourf(arr)
    plt.savefig(fn)
    plt.close(fig)
    gc.collect()
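A quick way to confirm that close() plus collect() really releases things is to check pyplot's open-figure registry after each iteration. A minimal sketch (plt.get_fignums() lists the figure numbers pyplot is still tracking; the array contents and filenames here are placeholders):

```python
import gc
import numpy as np
import matplotlib
matplotlib.use('Agg')  # off-screen backend, as in a batch script
import matplotlib.pyplot as plt

for i in range(3):
    fig = plt.figure()
    ax = fig.add_subplot(111)
    ax.contourf(np.random.rand(10, 10))  # placeholder data
    fig.savefig('fig_%d.png' % i)
    plt.close(fig)   # drop the figure from pyplot's registry
    gc.collect()     # then reclaim cycle-bound objects right away

# nothing should be left registered after the loop
print(plt.get_fignums())
```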
Eric Firing wrote:
>
> plt.close(fig)  # that should take care of it

Thanks for your quick reply.

I tried `plt.close()` before posting and it didn't work, but
`plt.close(fig)` also doesn't stop memory growing with every loop.
BTW, I'm on Windows with Matplotlib 1.2.1
On 2013/05/27 9:51 PM, zetah wrote:
Hi,

if I use something like this:

==
import numpy as np
import matplotlib.pyplot as plt

def draw_fig(arr, fn):
    fig = plt.figure()
    ax = fig.add_subplot(111)
    ax.contourf(arr)
    plt.savefig(fn)

if __name__ == '__main__':
    for i i
On Sunday, March 13, 2011, onet wrote:
> On Fri, 2011-03-11 at 17:08 -1000, Eric Firing wrote:
>> On 03/11/2011 02:54 PM, onet wrote:
Hi,

Using matplotlib I am trying to plot satellite observations, which consist of
roughly one million patches that are not gridded regularly.
I first collect the vertices (corner points of the observations) and
colors, and then use PolyCollection and ax.add_collection to add these
patches to the figure
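The approach described can be sketched as follows. Hypothetical random quadrilaterals stand in for the satellite footprints here; the real script builds verts and colors from the observations:

```python
import numpy as np
import matplotlib
matplotlib.use('Agg')  # render off-screen
import matplotlib.pyplot as plt
from matplotlib.collections import PolyCollection

rng = np.random.default_rng(0)
centers = rng.uniform(0.0, 10.0, size=(1000, 2))           # hypothetical patch centers
corners = np.array([[-.1, -.1], [.1, -.1], [.1, .1], [-.1, .1]])
verts = centers[:, None, :] + corners[None, :, :]          # (N, 4, 2) corner points
colors = rng.uniform(size=len(verts))                      # one scalar per patch

fig, ax = plt.subplots()
coll = PolyCollection(verts, array=colors, cmap='viridis', edgecolors='none')
ax.add_collection(coll)   # one collection instead of a million Patch artists
ax.autoscale_view()
fig.savefig('patches.png')
plt.close(fig)
```

A single PolyCollection keeps the per-patch overhead far below what individual Patch artists would cost, which matters at the million-patch scale described.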
Thank you for your help. I upgraded to the latest development version,
and as you said, memory use dropped a ton. I will have to test more to
confirm that the problem is completely gone, but this appears to bring
memory usage down to something quite manageable (at least on my 8 GB box
...).
T
Tom,

I just went through this, though with version 1.0.1 of mpl, so it may be
different. You can read the very long thread at:
http://www.mail-archive.com/matplotlib-users@lists.sourceforge.net/msg20031.html
Those who maintain mpl don't think there is a memory leak. What I found was
that imshow
I am using matplotlib pylab in association with ipython -pylab to show
many large (~2000x2000 or larger) images. Each time I show another
image, it consumes more memory, until eventually exhausting all system
memory and making my whole system unresponsive.
The easiest way to replicate this behavior
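One pattern that avoids accumulating image artists is to create a single AxesImage and update it in place. A hedged sketch (set_data is a standard Artist method; whether it resolves this particular case is not confirmed here):

```python
import numpy as np
import matplotlib
matplotlib.use('Agg')  # headless here; interactively a GUI backend applies
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
im = ax.imshow(np.zeros((2000, 2000)), vmin=0.0, vmax=1.0)  # one image artist

for i in range(3):  # stand-in for showing many large images
    im.set_data(np.random.rand(2000, 2000))  # swap pixels; no new artist created
    fig.canvas.draw()

print(len(ax.images))  # still exactly one AxesImage on the axes
```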
No problem. This caught me out a long time ago and has also caught out
a few people I know.
On Fri, Jan 14, 2011 at 8:23 PM, CASOLI Jules wrote:
Hooo, well done! This is it.

I didn't know about caching...
I was indeed using ipython, but I did run some tests using the basic python
interpreter, with the same results, so I did not mention this point.
In fact, python's basic interpreter still records the last three outputs. As my
tests were really
You're not doing this from ipython, are you? Its cache hangs onto the
plot object references and stops python's garbage collector from
releasing them. If so, you can disable the cache as a workaround. A
better option would be if ipython implemented an option to avoid
caching references to matplotlib
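For the workaround mentioned above, IPython's output cache can be limited or disabled via its cache-size setting. Shown here as the modern command-line flag and config option; the exact spelling has shifted across IPython versions (older releases used `-cs 0`), so check your version's docs:

```shell
# Start ipython with the result cache disabled, so Out[] keeps no references
ipython --cache-size=0

# or persistently, in ipython_config.py:
#   c.InteractiveShell.cache_size = 0
```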
On Thu, Jan 13, 2011 at 7:54 AM, CASOLI Jules wrote:
Hello to all,

This is yet another question about matplotlib not freeing memory when closing
a figure (using close()).
Here is what I'm doing (tried with several backends, on MacOSX and Linux, with
similar results):

import matplotlib as mpl
from matplotlib import pyplot as pl
2010/8/16 Craig Lyndon :
> If data sets are indeed stored in RAM, is there a way to discard the
> plot data after a plot had been created to leave just a static image?
> Or, read and store data points directly from a file?
You can render to PIL using the Agg backend, and display this via
PIL.Image
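A sketch of that idea: render once with Agg, copy the pixels out, and drop the figure, so only the static image stays in memory (here a NumPy array, convertible to a PIL image with PIL.Image.frombytes). The sample curve is a placeholder:

```python
import numpy as np
import matplotlib
matplotlib.use('Agg')  # Agg renders to an in-memory buffer
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
ax.plot(np.sin(np.linspace(0, 50, 500000)))  # large placeholder data set

fig.canvas.draw()                                   # rasterize
w, h = fig.canvas.get_width_height()
rgba = np.asarray(fig.canvas.buffer_rgba()).copy()  # (h, w, 4) uint8 pixels
plt.close(fig)  # the 500k-sample plot data can now be garbage collected

# PIL.Image.frombytes('RGBA', (w, h), rgba.tobytes()) would give a PIL
# image to display, with no plot data kept in RAM.
print(rgba.shape, rgba.dtype)
```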
Hi all,
I am using matplotlib to create a static graph from a few fairly large
data sets. (Over 500,000 samples)
I know you can pull the data sets out of an axes using the functions
get_xdata() and get_ydata(),
which leads me to wonder how matplotlib handles large data sets. Does
it keep the entire
Hi Michael,
Thanks for your explanation. It turns out that it is a combination of
(1) and (3). I hadn't thought about (1) and I hadn't done enough
playing to see the python interpreter releasing blocks of memory. As
you suggested, the "solution" is to limit the iPython cache by using
the iPython -
There are at least three possible causes of what you're seeing here:
1) ipython stores references to all results in the console. (ipython
maintains a history of results so they can easily be accessed later). I
don't recall the details, but it may be possible to turn this feature
off or limit
Is there a summary somewhere of the current state of knowledge about
memory leaks when using the pylab interface interactively? Doing
plot(rand(100)) or matshow(rand(1000,1000)) for example eats a big
chunk of memory (tried with TkAgg and WxAgg in Windows (mpl v0.98.5.2)
and Linux (mpl v0.9