Muffles, on 2011-04-06 02:26,  wrote:
> I have made a Python script to plot some NetCDF files that range from 4 MB
> to 80 MB, and the main purpose is to plot one after another. This seems to
> work fine, no problems there, but sometimes the computer crashes, and I
> believe it's because the files are too big. Is there any way to speed up
> the plotting or reduce memory consumption?
> 
> If you're asking what the error is: actually, I get no error, it just never
> finishes plotting. I waited for about 10 hours and nothing happened,
> so I assume it crashed.

How many such files would you say you're plotting?

If you are creating multiple figures as you go, are you closing
the ones you no longer need?
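To illustrate the point: every figure that isn't explicitly closed stays
referenced by pyplot's internal state, so memory grows with each file. A
minimal sketch of a loop that plots one dataset after another without
accumulating figures (the random array is just a placeholder for whatever
2-D variable you read from each NetCDF file):

```python
import io
import matplotlib
matplotlib.use("Agg")  # non-interactive backend; no display needed
import matplotlib.pyplot as plt
import numpy as np

for i in range(3):  # stands in for looping over the NetCDF files
    fig, ax = plt.subplots()
    data = np.random.rand(50, 50)  # placeholder for the real variable
    ax.imshow(data)
    buf = io.BytesIO()             # stand-in for a PNG path on disk
    fig.savefig(buf, format="png")
    plt.close(fig)                 # release the figure before the next file
```

Without the `plt.close(fig)` call, all the figures stay alive until the
script exits, which matches the "works for a while, then grinds to a halt"
symptom.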

If you are using IPython, did you disable the output caching?
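(IPython's output cache keeps a reference to every result it displays, so
returned figures can never be garbage collected. If memory serves, you can
turn it off via the config option below; the exact flag spelling varies
between IPython versions, so check the docs for yours.)

```
# in your profile's ipython_config.py:
c.InteractiveShell.cache_size = 0

# or at launch:
ipython --cache-size=0
```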

I realize this'll be difficult, but if you can reduce the
problematic code to a standalone script that reproduces the
issue, we'd be able to track it down much more definitively
and find a solution.

best,
-- 
Paul Ivanov
314 address only used for lists,  off-list direct email at:
http://pirsquared.org | GPG/PGP key id: 0x0F3E28F7 


_______________________________________________
Matplotlib-users mailing list
Matplotlib-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/matplotlib-users
