Re: [Matplotlib-users] Memory usage when plotting sequential figures

2013-05-28 Thread Eric Firing
On 2013/05/27 9:51 PM, zetah wrote:
 Hi,

 if I use something like this:

 ==
 import numpy as np
 import matplotlib.pyplot as plt

 def draw_fig(arr, fn):
     fig = plt.figure()
     ax = fig.add_subplot(111)
     ax.contourf(arr)
     plt.savefig(fn)

plt.close(fig)  # that should take care of it

 if __name__ == '__main__':
     for i in range(10):
         draw_fig(np.random.random((10, 10)), 'fig_%02d.png' % i)
 ==

 memory usage grows with every loop, so I can't plot many sequences this way.

 I know there is an animation class in Matplotlib, but this way is easier for me,
 and I suspect I'm missing something fundamental. How can I
 avoid the memory leak with this approach?

 Thanks


 --
 Try New Relic Now  We'll Send You this Cool Shirt
 New Relic is the only SaaS-based application performance monitoring service
 that delivers powerful full stack analytics. Optimize and monitor your
 browser, app,  servers with just a few lines of code. Try New Relic
 and get this awesome Nerd Life shirt! http://p.sf.net/sfu/newrelic_d2d_may
 ___
 Matplotlib-users mailing list
 Matplotlib-users@lists.sourceforge.net
 https://lists.sourceforge.net/lists/listinfo/matplotlib-users





Re: [Matplotlib-users] Memory usage when plotting sequential figures

2013-05-28 Thread Albert Kottke
Correct.


On Tue, May 28, 2013 at 9:42 AM, zetah ot...@hush.ai wrote:

 Albert Kottke wrote:
 
 I had this problem as well. I think my solution was to tell the
 garbage collector to collect.
 
 import gc
 import numpy as np
 import matplotlib.pyplot as plt
 
  def draw_fig(arr, fn):
      fig = plt.figure()
      ax = fig.add_subplot(111)
      ax.contourf(arr)
      plt.savefig(fn)
      plt.close(fig)
      gc.collect()
 
 I tried to test this with Python3.3, but didn't have any issues
 with memory increasing when using 'plt.close'.


 Thanks Albert, that indeed does the trick :)

 If I understand your last sentence correctly, you are saying that garbage-collector
 intervention isn't needed with Python 3.3.


 Cheers
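[A sketch of another way to sidestep the problem entirely, not from the thread itself: build each figure through the object-oriented API with an explicit Agg canvas. The figure is then never registered in pyplot's global figure manager, so there is nothing to close and nothing for the interpreter to hold on to.]

```python
import numpy as np
from matplotlib.figure import Figure
from matplotlib.backends.backend_agg import FigureCanvasAgg

def draw_fig(arr, fn):
    # The Figure is created directly and never registered with pyplot,
    # so it is freed as soon as it goes out of scope.
    fig = Figure()
    canvas = FigureCanvasAgg(fig)
    ax = fig.add_subplot(111)
    ax.contourf(arr)
    canvas.print_figure(fn)

for i in range(10):
    draw_fig(np.random.random((10, 10)), 'fig_%02d.png' % i)
```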




Re: [Matplotlib-users] Memory usage with 1M of Polygons

2011-03-13 Thread onet
On Fri, 2011-03-11 at 17:08 -1000, Eric Firing wrote:
 On 03/11/2011 02:54 PM, onet wrote:
  Using matplotlib I try to plot satellite observations, which consist of
  roughly one million patches that are not gridded regularly.
  I first collect the vertices (corner points of the observations) and
  colors and then use PolyCollection and ax.add_collection to add these
  patches to the figure.
 
  On my 64bit Linux machine:
  # 1M patches will use ~4Gb of memory
 
  My question: how can I plot more efficiently and use less memory?
 
 If your data are on a quadrilateral mesh, as in your example, (or can be 
 approximately mapped onto such a mesh) then pcolormesh should be very 
 much more efficient both in time and in memory than making a PolyCollection.

The data I want to plot is not as regular as in the example (which was
just to generate lots of non-overlapping patches): when projected on the
map it has different shapes along the orbit of the satellite, almost
square at the equator and rotated near the poles. See the example
link below from a plot in IDL.

http://temis.nl/o3msaf/vaac/gome2/vaac/daily/images/2011/S-O3M_GOME_NAR_02_M02_20110312000254Z_20110313000254Z_N_O_20110313024518Z.AAI_Global.Unfiltered.png

But I think my satellite data along an orbit is probably piecewise
regular enough to try the pcolormesh approach.

So thanks for the suggestion!

Best regards,

Olaf.
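[A minimal sketch of the pcolormesh approach Eric suggests. The grid and data here are invented for illustration; a real orbit would supply curvilinear X/Y corner arrays instead of a regular lon/lat mesh.]

```python
import numpy as np
import matplotlib
matplotlib.use('Agg')  # render off-screen, as in the example script
import matplotlib.pyplot as plt

# Corner coordinates of the quadrilateral mesh: one more row/column
# than the data array (use 720x1440 to match the PolyCollection case).
nlats, nlons = 180, 360
lon = np.linspace(-180.0, 180.0, nlons + 1)
lat = np.linspace(-90.0, 90.0, nlats + 1)
X, Y = np.meshgrid(lon, lat)
C = np.random.random((nlats, nlons))

fig, ax = plt.subplots()
# pcolormesh stores the mesh as coordinate arrays instead of building a
# Python object per cell, which is why it is far lighter than a
# PolyCollection of individual quads.
ax.pcolormesh(X, Y, C)
fig.savefig('pcolormesh_demo.png', dpi=100)
plt.close(fig)
```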




Re: [Matplotlib-users] Memory usage with 1M of Polygons

2011-03-13 Thread Benjamin Root
On Sunday, March 13, 2011, onet o...@dds.nl wrote:
 On Fri, 2011-03-11 at 17:08 -1000, Eric Firing wrote:
 On 03/11/2011 02:54 PM, onet wrote:
  Using matplotlib I try to plot satellite observations, which consist of
  roughly one million patches that are not gridded regularly.
  I first collect the vertices (corner points of the observations) and
  colors and then use PolyCollection and ax.add_collection to add these
  patches to the figure.
 
  On my 64bit Linux machine:
  # 1M patches will use ~4Gb of memory
 
  My question: how can I plot more efficiently and use less memory?

 If your data are on a quadrilateral mesh, as in your example, (or can be
 approximately mapped onto such a mesh) then pcolormesh should be very
 much more efficient both in time and in memory than making a PolyCollection.

 The data I want to plot is not as regular as in the example (this was
 just to generate lots of non-overlapping patches) but it has different
 shapes along the orbit of the satellite when projected on the map.
 Almost square at the equator and rotated near the poles. See example
 link below from a plot in IDL.

 http://temis.nl/o3msaf/vaac/gome2/vaac/daily/images/2011/S-O3M_GOME_NAR_02_M02_20110312000254Z_20110313000254Z_N_O_20110313024518Z.AAI_Global.Unfiltered.png

 But I think my satellite data along an orbit is probably piecewise
 regular enough to try the pcolormesh approach.

 So thanks for the suggestion!

 Best regards,

         Olaf.


It might be regular with respect to a particular projection.  Have you
considered checking out Basemap?

Ben Root



Re: [Matplotlib-users] Memory usage with 1M of Polygons

2011-03-11 Thread Eric Firing
On 03/11/2011 02:54 PM, onet wrote:
 Hi,

 Using matplotlib I try to plot satellite observations, which consist of
 roughly one million patches that are not gridded regularly.
 I first collect the vertices (corner points of the observations) and
 colors and then use PolyCollection and ax.add_collection to add these
 patches to the figure.

 On my 64bit Linux machine:
 # 518400 patches will use ~2Gb of memory
 # 1M patches will use ~4Gb of memory
 On a 32bit machine the memory is roughly half compared to 64bit.

 My question: how can I plot more efficiently and use less memory?

If your data are on a quadrilateral mesh, as in your example, (or can be 
approximately mapped onto such a mesh) then pcolormesh should be very 
much more efficient both in time and in memory than making a PolyCollection.

Eric

 An example script with random data is below.

 System:
 Debian Testing: kernel 2.6.32-5-amd64 x86_64 GNU/Linux
 Python 2.6.6 (r266:84292, Dec 26 2010, 22:31:48)
 matplotlib.__version__ = '1.0.0'

 Best regards,

   Onet.



 #!/usr/bin/env python
 #
 # Purpose : Show large use of memory when plotting
 #   large numbers of patches
 #

 import random

 import matplotlib
 matplotlib.use('AGG') # produce AGG graphics (o.a. PNG) by default
 import matplotlib.pyplot as plt
 import matplotlib.colors as colors
 from matplotlib.collections import PolyCollection
 from mpl_toolkits.basemap import Basemap


 def test_polycollection(NLats, NLons):
     """Test poly collections."""

     fig = plt.figure()
     ax = fig.add_axes([0.1, 0.1, 0.8, 0.8])
     figmap = Basemap(llcrnrlon=-180., llcrnrlat=-90.,
                      urcrnrlon=180., urcrnrlat=90.,
                      resolution='c', area_thresh=1.,
                      projection='cyl')

     # Color map and min/max bounds
     cmap = plt.cm.jet
     vmin = 0
     vmax = 10

     # Arrays for the vertices and the colors
     Poly_Vertices = []
     Poly_Colors = []

     # add pixel to array of vertices and set a random color
     for LatNr in range(0, NLats):
         for LonNr in range(0, NLons, 2):

             # shift lon 1 point if odd for staggered grid
             if LatNr % 2 == 0:
                 # even
                 ShiftLon = 0
             else:
                 # odd
                 ShiftLon = 1

             # calc coordinates for vertex storage
             x1, y1 = (359.*(ShiftLon + LonNr + 1)/(NLons+1) - 179.9,
                       179.*(LatNr)/(NLats+1) - 89.9)
             x2, y2 = (359.*(ShiftLon + LonNr + 2)/(NLons+1) - 179.9,
                       179.*(LatNr + 1)/(NLats+1) - 89.9)
             x3, y3 = (359.*(ShiftLon + LonNr + 1)/(NLons+1) - 179.9,
                       179.*(LatNr + 2)/(NLats+1) - 89.9)
             x4, y4 = (359.*(ShiftLon + LonNr)/(NLons+1) - 179.9,
                       179.*(LatNr + 1)/(NLats+1) - 89.9)

             # get RGB colors, cut off alpha.
             RandomValue = random.random() * vmax
             colorsgen = cmap((RandomValue - vmin) / (vmax - vmin))[:3]

             # add the polygon vertices and the color to the array
             Poly_Vertices.append([(x1, y1), (x2, y2), (x3, y3), (x4, y4)])
             Poly_Colors.append(colorsgen)

     # Create PolyCollection and add it to the axes
     print 'PolyCollection: number of elements: ', len(Poly_Colors)
     Data_PatchCollection = PolyCollection(Poly_Vertices,
                                           facecolor=Poly_Colors,
                                           edgecolor='black',
                                           linewidth=0)

     print 'add_collection'
     ax.add_collection(Data_PatchCollection)
     print 'add_collection done'

     # finish the plot by drawing coastlines
     figmap.drawcoastlines()

     plt.title('PolyCollection on a map')
     fig.savefig('polycol.png', dpi=300)
     plt.close()

     return
 #
 # End test_polycollection
 #


 if __name__ == '__main__':
     """Test the memory size of matplotlib using poly
     collections.

     On a 64 bit linux machine the memory use is
     enormous when plotting large numbers of patches
     via matplotlib / PolyCollection.

     For 518400 patches matplotlib will use ~2Gb of
     memory. On a 32 bit Linux machine, the memory
     usage is roughly half.

     Can this be done more efficiently?

     Debian Testing: Linux host 2.6.32-5-amd64 #1 SMP
     Wed Jan 12 03:40:32 UTC 2011 x86_64 GNU/Linux
     Python 2.6.6 (r266:84292, Dec 26 2010, 22:31:48)
     matplotlib.__version__ = '1.0.0'
     """

     # 129600 patches will use ~630Mb of memory (on 64bit Linux)
     #NLats = 360
     #NLons = 720

     # 259200 patches will use ~1Gb of memory (on 64bit Linux)
     #NLats = 360
     #NLons = 1440

     # 518400 patches will use ~2Gb of memory (on 64bit Linux)
     NLats = 720
     NLons = 1440

     #
     # test the 

Re: [Matplotlib-users] memory usage with repeated imshow

2011-02-14 Thread Tom Dimiduk
Thank you for your help.  I upgraded to the latest development version, 
and as you said, memory use dropped a ton.  I will have to test more to 
confirm that the problem is completely gone, but this appears to bring 
memory usage down to something quite manageable (at least on my 8gb box 
...).

Tom

On 02/09/2011 07:30 PM, Robert Abiad wrote:
 Tom,

 I just went through this, though with version 1.0.1 of mpl, so it may be 
 different.  You can read the
 very long thread at:

 http://www.mail-archive.com/matplotlib-users@lists.sourceforge.net/msg20031.html

 Those who maintain mpl don't think there is a memory leak. What I found was 
 that imshow() does
 consume a lot of memory (now fixed in the development version) and that the 
 first 2 or so uses build
 on each other, but after that it levels off giving back memory after close(). 
  There is a
 discrepancy between what python reports it's using and what the OS reports (I 
 had 500MB from the OS,
 but only 150MB from python).  There is a chance that ipython is caching your 
 results (try ipython
 -pylab -cs 0), but when I ran without ipython, python still had a large 
 portion of memory.

 -robert

 On 2/9/2011 3:52 PM, Tom Dimiduk wrote:
 I am using matplotlib pylab in association with ipython -pylab to show
 many large (~2000x2000 or larger) images.  Each time I show another
 image it consumes more memory until eventually exhausting all system
 memory and making my whole system unresponsive.

 The easiest way to replicate this behaviour is with
 a = ones((2000, 2000))
 imshow(a)

 optionally

 close()

 and then

 imshow(a)

 again.  I am using ipython 0.10.1 and matplotlib 0.99.3.  Is there
 something I should be doing differently to avoid this problem?  Is it
 fixed in a later version?

 Thanks,
 Tom



Re: [Matplotlib-users] memory usage with repeated imshow

2011-02-09 Thread Robert Abiad
Tom,

I just went through this, though with version 1.0.1 of mpl, so it may be 
different.  You can read the 
very long thread at:

http://www.mail-archive.com/matplotlib-users@lists.sourceforge.net/msg20031.html

Those who maintain mpl don't think there is a memory leak. What I found was 
that imshow() does 
consume a lot of memory (now fixed in the development version) and that the 
first 2 or so uses build 
on each other, but after that it levels off giving back memory after close().  
There is a 
discrepancy between what python reports it's using and what the OS reports (I 
had 500MB from the OS, 
but only 150MB from python).  There is a chance that ipython is caching your 
results (try ipython 
-pylab -cs 0), but when I ran without ipython, python still had a large portion 
of memory.

-robert

On 2/9/2011 3:52 PM, Tom Dimiduk wrote:
 I am using matplotlib pylab in association with ipython -pylab to show
 many large (~2000x2000 or larger) images.  Each time I show another
 image it consumes more memory until eventually exhausting all system
 memory and making my whole system unresponsive.

 The easiest way to replicate this behaviour is with
 a = ones((2000, 2000))
 imshow(a)

 optionally

 close()

 and then

 imshow(a)

 again.  I am using ipython 0.10.1 and matplotlib 0.99.3.  Is there
 something I should be doing differently to avoid this problem?  Is it
 fixed in a later version?

 Thanks,
 Tom



Re: [Matplotlib-users] Memory usage

2011-01-15 Thread gary ruben
No problem. This caught me out a long time ago and has also caught out
a few people I know.

On Fri, Jan 14, 2011 at 8:23 PM, CASOLI Jules jules.cas...@cea.fr wrote:
 Hooo, well done! This is it.

 I didn't know about caching...
 I was indeed using ipython, but I also ran some tests with the basic python
 interpreter, with the same results, so I did not mention this point.
 In fact, python's basic interpreter still records the last three outputs. As
 my tests were really short (plt.close() ; mpl.cbook.report_memory() ;
 gc.collect() is only two lines before the collect, only one of them
 outputting something), even python's caching was still at work, and the
 garbage collector could not free anything.

 Thanks a lot, and also thanks to Ben for taking interest !

 Jules

 PS : Gary, sorry, for the duplicated mail...



Re: [Matplotlib-users] Memory usage

2011-01-14 Thread CASOLI Jules
Hooo, well done! This is it.

I didn't know about caching...
I was indeed using ipython, but I also ran some tests with the basic python
interpreter, with the same results, so I did not mention this point.
In fact, python's basic interpreter still records the last three outputs. As my
tests were really short (plt.close() ; mpl.cbook.report_memory() ; gc.collect()
is only two lines before the collect, only one of them outputting something),
even python's caching was still at work, and the garbage collector
could not free anything.

Thanks a lot, and also thanks to Ben for taking interest !

Jules

PS : Gary, sorry, for the duplicated mail...

On 14 Jan 2011, at 04:04, gary ruben wrote:

 You're not doing this from ipython are you? Its cache hangs onto the
 plot object references and stops python's garbage collector from
 releasing them. If so, you can disable the cache as a workaround. A
 better option would be if ipython implemented an option to avoid
 caching references to matplotlib objects.
 
 Gary R.
 
 On Fri, Jan 14, 2011 at 2:59 AM, Benjamin Root ben.r...@ou.edu wrote:
 On Thu, Jan 13, 2011 at 7:54 AM, CASOLI Jules jules.cas...@cea.fr wrote:
 
 Hello to all,
 
 This is yet another question about matplotlib not freeing memory, when
 closing a figure (using close()).
 Here is what I'm doing (tried with several backends, on MacOSX and Linux,
 with similar results):
 
 import matplotlib as mpl
  from matplotlib import pyplot as plt
 import numpy as np
 
 a = np.arange(100)
 mpl.cbook.report_memory()
 # - output: 54256
 plt.plot(a)
 mpl.cbook.report_memory()
 # - output: 139968
 plt.close()
 mpl.cbook.report_memory()
 # - output: 138748
 
 
 Shouldn't plt.close() close the figure _and_ free the memory used by it?
 What am I doing wrong ?
 I tried several other ways to free the memory, such as f = figure(); ... ;
 del f, without luck.
 
 Any help appreciated !
 
  P.S. : side question : how come the call to plot takes so much memory (90MB
  for an 8MB array?). I have read somewhere that each point is coded as three
  RGB floats, but that only means an approx. 12MB plot... (plus small overhead)
 
 Jules
 
 
 
 Jules,
 
 Which version of Matplotlib are you using and which backend?  On my Linux
 install of matplotlib (development branch) using GTKAgg, the memory usage
 does get high during the call to show(), but returns to (near) normal
 amounts after I close.  An interesting observation is that if the
 interactive mode is off, the memory usage returns back to just a few
 kilobytes above where it was before, but if interactive mode was turned on,
 the memory usage returned to being a few hundred kilobytes above where it
 started.
 
 Ben Root
 
 P.S. - As a side note, estimating the memory size of these plots from the
 given data isn't as straight-forward as multiplying by three (actually, it
 would be four because of the transparency value in addition to rgb).  There
  are many other parts of the graph that need to be represented (all having
 rgba values) but there are also a lot of simplifications that are done to
 reduce the amount of memory needed to represent these objects.
 
 
 




Re: [Matplotlib-users] Memory usage

2011-01-13 Thread Benjamin Root
On Thu, Jan 13, 2011 at 7:54 AM, CASOLI Jules jules.cas...@cea.fr wrote:

 Hello to all,

 This is yet another question about matplotlib not freeing memory, when
 closing a figure (using close()).
 Here is what I'm doing (tried with several backends, on MacOSX and Linux,
 with similar results):
 
 import matplotlib as mpl
  from matplotlib import pyplot as plt
 import numpy as np

 a = np.arange(100)
 mpl.cbook.report_memory()
 # - output: 54256
 plt.plot(a)
 mpl.cbook.report_memory()
 # - output: 139968
 plt.close()
 mpl.cbook.report_memory()
 # - output: 138748
 

 Shouldn't plt.close() close the figure _and_ free the memory used by it?
 What am I doing wrong?
 I tried several other ways to free the memory, such as f = figure(); ... ;
 del f, without luck.

 Any help appreciated !

 P.S. : side question : how come the call to plot takes so much memory (90MB
 for an 8MB array?). I have read somewhere that each point is coded as three
 RGB floats, but that only means an approx. 12MB plot... (plus small overhead)

 Jules



Jules,

Which version of Matplotlib are you using and which backend?  On my Linux
install of matplotlib (development branch) using GTKAgg, the memory usage
does get high during the call to show(), but returns to (near) normal
amounts after I close.  An interesting observation is that if the
interactive mode is off, the memory usage returns back to just a few
kilobytes above where it was before, but if interactive mode was turned on,
the memory usage returned to being a few hundred kilobytes above where it
started.

Ben Root

P.S. - As a side note, estimating the memory size of these plots from the
given data isn't as straight-forward as multiplying by three (actually, it
would be four because of the transparency value in addition to rgb).  There
are many other parts of the graph that need to be represented (all having
rgba values) but there are also a lot of simplifications that are done to
reduce the amount of memory needed to represent these objects.


Re: [Matplotlib-users] Memory usage

2011-01-13 Thread gary ruben
You're not doing this from ipython are you? Its cache hangs onto the
plot object references and stops python's garbage collector from
releasing them. If so, you can disable the cache as a workaround. A
better option would be if ipython implemented an option to avoid
caching references to matplotlib objects.

Gary R.

On Fri, Jan 14, 2011 at 2:59 AM, Benjamin Root ben.r...@ou.edu wrote:
 On Thu, Jan 13, 2011 at 7:54 AM, CASOLI Jules jules.cas...@cea.fr wrote:

 Hello to all,

 This is yet another question about matplotlib not freeing memory, when
 closing a figure (using close()).
 Here is what I'm doing (tried with several backends, on MacOSX and Linux,
 with similar results):
 
 import matplotlib as mpl
  from matplotlib import pyplot as plt
 import numpy as np

 a = np.arange(100)
 mpl.cbook.report_memory()
 # - output: 54256
 plt.plot(a)
 mpl.cbook.report_memory()
 # - output: 139968
 plt.close()
 mpl.cbook.report_memory()
 # - output: 138748
 

 Shouldn't plt.close() close the figure _and_ free the memory used by it?
 What am I doing wrong ?
 I tried several other ways to free the memory, such as f = figure(); ... ;
 del f, without luck.

 Any help appreciated !

  P.S. : side question : how come the call to plot takes so much memory (90MB
  for an 8MB array?). I have read somewhere that each point is coded as three
  RGB floats, but that only means an approx. 12MB plot... (plus small overhead)

 Jules



 Jules,

 Which version of Matplotlib are you using and which backend?  On my Linux
 install of matplotlib (development branch) using GTKAgg, the memory usage
 does get high during the call to show(), but returns to (near) normal
 amounts after I close.  An interesting observation is that if the
 interactive mode is off, the memory usage returns back to just a few
 kilobytes above where it was before, but if interactive mode was turned on,
 the memory usage returned to being a few hundred kilobytes above where it
 started.

 Ben Root

 P.S. - As a side note, estimating the memory size of these plots from the
 given data isn't as straight-forward as multiplying by three (actually, it
 would be four because of the transparency value in addition to rgb).  There
 are many other parts of the graph that need to be represented (all having
 rgba values) but there are also a lot of simplifications that are done to
 reduce the amount of memory needed to represent these objects.



Re: [Matplotlib-users] Memory Usage

2010-08-18 Thread Friedrich Romstedt
2010/8/16 Craig Lyndon c.a.lyn...@gmail.com:
 If data sets are indeed stored in RAM, is there a way to discard the
 plot data after a plot had been created to leave just a static image?
 Or, read and store data points directly from a file?

You can render to PIL using the Agg backend, and display this via
PIL.ImageTk in Tkinter if you're using Tkinter.

Just an option ... I think this is quite straightforward.

Friedrich
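[A sketch of Friedrich's suggestion, adapted slightly: this grabs the rendered Agg RGBA buffer as a NumPy array rather than a PIL image; wrapping it with `PIL.Image.fromarray(img)` and `PIL.ImageTk.PhotoImage` would be the next step for display in Tkinter.]

```python
import numpy as np
import matplotlib
matplotlib.use('Agg')  # pure raster backend, no GUI needed
import matplotlib.pyplot as plt

fig, ax = plt.subplots(figsize=(2, 2), dpi=100)
ax.plot(np.arange(10))
fig.canvas.draw()  # render the figure with Agg

# Copy the rendered pixels into an (H, W, 4) uint8 RGBA array; the figure
# and its plot data can then be closed, keeping only the static image.
img = np.asarray(fig.canvas.buffer_rgba())  # shape (200, 200, 4) here
plt.close(fig)
```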



Re: [Matplotlib-users] memory usage (leakage?) in ipython interactive mode

2009-03-06 Thread Gary Ruben
Hi Michael,

Thanks for your explanation. It turns out that it is a combination of
(1) and (3). I hadn't thought about (1) and I hadn't done enough
playing to see the python interpreter releasing blocks of memory. As
you suggested, the solution is to limit the iPython cache by using
the iPython -cs option.

thanks for your help,
Gary

Michael Droettboom wrote:
 There are at least three possible causes of what you're seeing here:
 
 1) ipython stores references to all results in the console.  (ipython 
 maintains a history of results so they can easily be accessed later).  I 
 don't recall the details, but it may be possible to turn this feature 
 off or limit the number of objects stored.
 
 2) matplotlib stores references to all figures until they are explicitly 
 closed with pyplot.close(fignum)
 
  3) Python uses pools of memory, and often imposes a significant delay 
 returning memory to the operating system.  It is actually very hard to 
 determine from the outside whether something is leaking or just pooling 
 without compiling a special build of Python with memory pooling turned off.
 
 In general, interactive use is somewhat at odds with creating many large 
 plots in a single session, since all of the nice interactive features 
 (history etc.) do not know automagically when the user is done with 
 certain objects.
 
 I am not aware of any memory leaks in current versions of matplotlib 
 with *noninteractive* use, other than small leaks caused by bugs in 
 older versions of some of the GUI toolkits (notably gtk+).  If you find 
 a script that produces a leak reproducibly, please share so we can track 
 down the cause.
 
 Gary Ruben wrote:
 Doing plot(rand(100)) or matshow(rand(1000,1000)) for example eats 
 a big chunk of memory (tried with TkAgg and WxAgg in Windows (mpl 
 v0.98.5.2) and Linux (mpl v0.98.3)), most of which is not returned 
 when the window is closed. The same goes if you create an array, plot 
 it, and explicitly del it after closing the window.
 Can you elaborate on these steps?  It's possible that the del has little 
 effect, since del only deletes a single reference to the object, not all 
 references which may be keeping it alive (such as the figure, which 
 matplotlib itself keeps a reference to).  In general, you need to 
 explicitly call pyplot.close(fignum) to delete a figure.
 
 Cheers,
 Mike
 



Re: [Matplotlib-users] memory usage (leakage?) in ipython interactive mode

2009-03-05 Thread Michael Droettboom
There are at least three possible causes of what you're seeing here:

1) ipython stores references to all results in the console.  (ipython 
maintains a history of results so they can easily be accessed later).  I 
don't recall the details, but it may be possible to turn this feature 
off or limit the number of objects stored.

2) matplotlib stores references to all figures until they are explicitly 
closed with pyplot.close(fignum)

3) Python uses pools of memory, and often imposes a significant delay 
returning memory to the operating system.  It is actually very hard to 
determine from the outside whether something is leaking or just pooling 
without compiling a special build of Python with memory pooling turned off.

In general, interactive use is somewhat at odds with creating many large 
plots in a single session, since all of the nice interactive features 
(history etc.) do not know automagically when the user is done with 
certain objects.

I am not aware of any memory leaks in current versions of matplotlib 
with *noninteractive* use, other than small leaks caused by bugs in 
older versions of some of the GUI toolkits (notably gtk+).  If you find 
a script that produces a leak reproducibly, please share so we can track 
down the cause.

Gary Ruben wrote:
 Doing 
 plot(rand(100)) or matshow(rand(1000,1000)) for example eats a big 
 chunk of memory (tried with TkAgg and WxAgg in Windows (mpl v0.98.5.2) 
 and Linux (mpl v0.98.3)), most of which is not returned when the window 
 is closed. The same goes if you create an array, plot it, and explicitly 
 del it after closing the window.
Can you elaborate on these steps?  It's possible that the del has little 
effect, since del only deletes a single reference to the object, not all 
references which may be keeping it alive (such as the figure, which 
matplotlib itself keeps a reference to).  In general, you need to 
explicitly call pyplot.close(fignum) to delete a figure.

Cheers,
Mike

-- 
Michael Droettboom
Science Software Branch
Operations and Engineering Division
Space Telescope Science Institute
Operated by AURA for NASA
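[Point (2) above, pyplot holding a reference to every figure until it is explicitly closed, can be observed directly. A small sketch using `plt.get_fignums`, which is available in any recent matplotlib:]

```python
import numpy as np
import matplotlib
matplotlib.use('Agg')  # non-interactive backend
import matplotlib.pyplot as plt

a = np.random.random(1000)
plt.plot(a)
# pyplot's registry now holds a reference to figure 1, keeping it alive
# even if no user variable points at the figure object.
assert plt.get_fignums() == [1]

plt.close('all')  # drop pyplot's reference so the figure can be freed
assert plt.get_fignums() == []
```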

