On 2/1/11 12:39 AM, Asmi Shah wrote:
> I have one more question: how can I avoid hitting MemoryError in
> numpy? I have about 200 images to stack into a numpy array, each at
> 1024x1344 resolution. Any ideas apart from downsampling?

If I'm doing my math right, that's about 262 MB, which shouldn't be a 
problem on modern systems. That's for 8-bit pixels; it's roughly 787 MB 
for 24-bit RGB.
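Spelling the arithmetic out (a quick sketch; the 200 frames and 
1024x1344 come from your message, uint8 assumed for the 8-bit case):

    # rough size check for the full stack
    n_images, h, w = 200, 1024, 1344
    print(n_images * h * w / 1024.0**2)      # ~262 MB, 8-bit grayscale
    print(n_images * h * w * 3 / 1024.0**2)  # ~787 MB, 24-bit RGB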

If you are careful about how many copies you're keeping around 
(including temporaries), you may be OK still.
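One way to keep the copies down is to pre-allocate the full array and 
fill it slice by slice, rather than building a list of arrays and 
stacking them at the end. Roughly (untested; load_image() below is just 
a stand-in for however you actually read each frame -- PIL, imread, 
etc.):

    import numpy as np

    n_images, h, w = 200, 1024, 1344
    stack = np.empty((n_images, h, w), dtype=np.uint8)

    for i in range(n_images):
        # assigning into the slice means only one extra frame is in
        # memory at a time, rather than a whole list plus a stacked copy
        stack[i] = load_image(i)   # load_image() is a placeholder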

But if you really do have big collections of images, you might try 
memory-mapped arrays -- as Sturla pointed out, they won't let you create 
monster arrays on a 32-bit Python, but maybe they do help keep you from 
clogging up memory too much? I don't know -- I haven't used them -- 
presumably they have a purpose.
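For what it's worth, my understanding is that np.memmap would be used 
something like this (an untested sketch; 'stack.dat' is just a 
placeholder filename):

    import numpy as np

    # the data live in 'stack.dat' on disk; slices are paged in on access
    stack = np.memmap('stack.dat', dtype=np.uint8, mode='w+',
                      shape=(200, 1024, 1344))

    stack[0] = 255    # writes go through to the file
    stack.flush()     # make sure everything is written out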

Also, PyTables is worth a look, as another way to get HDF5 data on 
disk, but with what I think is more "natural" access.
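A rough idea of what that looks like with PyTables (a sketch going from 
the docs, not tested; 'stack.h5' and the node name are arbitrary, and 
the lowercase method names are the newer API):

    import numpy as np
    import tables

    # an extendable array on disk; frames get appended one at a time
    h5 = tables.open_file('stack.h5', mode='w')
    images = h5.create_earray(h5.root, 'images',
                              atom=tables.UInt8Atom(),
                              shape=(0, 1024, 1344))

    frame = np.zeros((1024, 1344), dtype=np.uint8)  # stand-in for a real image
    images.append(frame[np.newaxis, ...])           # append along the first axis

    first = images[0]   # reads come straight from the file
    h5.close()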

-Chris

-- 
Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

chris.bar...@noaa.gov