Hi,

In an application that updates a plot with new experimental
data (say, once per second, with experiments lasting for
hours), I have tried two approaches:
1) Clear the axes and plot the new data - this is
slow and uses too much CPU.
2) Remove the old lines and plot the new data - this is
fast enough, but unfortunately there seems to be a memory
leak and the application eventually runs out of memory.

Here is a simple script that demonstrates the leak:

#
import numpy
from numpy.testing.utils import memusage
import matplotlib.pyplot as plt

x = range(1000)
axes1 = plt.figure().add_subplot(111)
y = numpy.random.rand(len(x))
while 1:
    if 1:
        # leaks: remove the old data line, then plot a new one
        # (iterate over a copy, since remove() mutates axes1.lines)
        for line in list(axes1.lines):
            if line.get_label() == 'data':
                line.remove()
    else:
        # no leak, but slow
        axes1.clear()
    axes1.plot(x, y, 'b', label='data')
    print memusage(), len(axes1.lines)
#eof

When the script is run, memory usage grows by 132 kB per
iteration. At one iteration per second that is 3600 * 132 kB,
or about 464 MB of RAM per hour, even though no new data has
been generated. In a real application this effect will be
even worse.

So, I am looking for advice on how to avoid
this memory leak without clearing the axes.
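
For reference, one alternative I have not tried yet would be to
keep a single Line2D object and update its data in place with
set_ydata(), so that no artists are created or destroyed inside
the loop. A rough, untested sketch (same names as in the script
above):

#
import numpy
import matplotlib.pyplot as plt

x = range(1000)
axes1 = plt.figure().add_subplot(111)
# create the data line once; plot() returns a list of Line2D objects
line, = axes1.plot(x, numpy.random.rand(len(x)), 'b', label='data')
while 1:
    # reuse the existing Line2D instead of removing/re-adding it
    line.set_ydata(numpy.random.rand(len(x)))
    axes1.figure.canvas.draw()
#eof

Whether this avoids the leak with the SVN version, I do not know.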

I am using matplotlib from SVN.

Thanks,
Pearu

