On Fri, May 28, 2010 at 3:18 AM, Pearu Peterson <pe...@cens.ioc.ee> wrote:
>
> Hi,
>
> In an application that updates a plot with new experimental
> data, say, every second, and where the experiment can last
> hours, I have tried two approaches:
> 1) clear the axes and plot the new experimental data - this
> is slow and takes too many CPU resources.
> 2) remove the lines and plot the new experimental data - this
> is fast enough, but unfortunately there seems to be a memory
> leak and the application runs out of memory.
>
> Here is a simple script that demonstrates the leak:
>
> #
> import numpy
> from numpy.testing.utils import memusage
> import matplotlib.pyplot as plt
> x = range(1000)
> axes1 = plt.figure().add_subplot(111)
> y = numpy.random.rand(len(x))
> while 1:
>     if 1:
>         # leaks: remove the old 'data' line before re-plotting
>         for line in list(axes1.lines):  # iterate over a copy while removing
>             if line.get_label() == 'data':
>                 line.remove()
>     else:
>         # no leak, but slow
>         axes1.clear()
>     axes1.plot(x, y, 'b', label='data')
>     print memusage(), len(axes1.lines)
> #eof
>
> When running the script, the memory usage increases by
> 132 kbytes per iteration; at one iteration per second this
> example application would consume about 464 MB of RAM within
> an hour, even though no new data has been generated. In a
> real application the effect will be even worse.
>
> So, I am looking for advice on how to avoid this memory
> leak without clearing the axes.

Hey Pearu -- thanks for the report.  We'll try to track down and fix
this leak.  In the interim, would an acceptable workaround for you be
to *reuse* an existing line by calling set_data on it?  That way you
wouldn't have to do the add/remove that is causing the leak.  Have
you confirmed this leak with various backends (e.g. Agg, PDF, PS)?
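
Something along these lines is what I have in mind -- a rough,
untested sketch based on your script (the relim/autoscale_view and
draw calls are only needed if the data limits change or you want the
canvas to refresh on each pass):

import numpy
from numpy.testing.utils import memusage
import matplotlib.pyplot as plt

x = range(1000)
axes1 = plt.figure().add_subplot(111)

# create the line once, then update its data in place on every pass
line, = axes1.plot(x, numpy.random.rand(len(x)), 'b', label='data')
while 1:
    y = numpy.random.rand(len(x))
    line.set_data(x, y)       # reuse the existing Line2D, no add/remove
    axes1.relim()             # recompute data limits from the new data
    axes1.autoscale_view()    # rescale the view if the limits changed
    plt.draw()                # redraw the current figure
    print memusage(), len(axes1.lines)

That keeps the number of artists on the axes constant, so nothing has
to be added or removed at all.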
