On 04/05/2011 11:26 PM, Muffles wrote:
>
> Hello all,
> I have made a Python script to plot some netCDF files that range from
> 4 MB to 80 MB, and the main purpose is to plot them one after another.
> This seems to work fine, no problems there, but sometimes the computer
> crashes, and I believe it's because the files are too big. Is there any
> way to speed up the plotting or reduce memory consumption?
>
> If you're asking what the error is: actually, I get no error; it just
> never finishes plotting. I waited for about 10 hours and nothing
> happened, so I assume it crashed.
>
> Thx in advance

In addition to what Paul wrote, check the following (assuming you are 
using netCDF4 to read the files; the same may apply to other netCDF 
interfaces):

Use slicing to extract the variables you are plotting as ndarrays rather 
than passing the netCDF4 variable objects themselves to the mpl routines. 
I haven't tracked down exactly what is happening, but I have noticed that 
extracting first and then plotting can be orders of magnitude faster than 
feeding the netCDF4 variable directly to the mpl plot command.
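
For example, here is a minimal sketch of the idea (the file name 
"data.nc" and the 2-D variable name "temperature" are placeholders; 
substitute whatever your netCDF files actually contain):

    import matplotlib.pyplot as plt
    from netCDF4 import Dataset

    nc = Dataset("data.nc")            # open the file read-only
    var = nc.variables["temperature"]  # a netCDF4 Variable object

    # Slicing the Variable reads the data into a numpy array in memory;
    # pass that array to matplotlib, not the Variable itself.
    data = var[:]          # or e.g. var[0, :, :] for a single time step

    plt.pcolormesh(data)
    plt.colorbar()
    plt.savefig("frame.png")
    plt.close()            # release the figure before the next file
    nc.close()

Closing each figure and each Dataset before moving on to the next file 
should also help keep memory use from growing as you loop over many files.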

Eric
