Vikalpa Jetly wrote:

>I am reading a very large array (~9000,11000) of 1 byte image values. I need
>to change values in the array that meet a certain condition so I am running
>something like:
>
>b = numpy.where(a>200,0,1)
>
>to create a new array with the changed values. However, I get a
>"MemoryError" everytime I try this. I have over 3gb of RAM on my machine
>(most of which is available). The process runs fine on smaller datasets. Is
>there a maximum array size that numpy handles? Any alternatives/workarounds?
>
>  
>
The MemoryError is a direct result of the system malloc failing. Rather
than use where with two Python scalar arguments (the resulting array will
be int32 and therefore 4 times larger than your 1-byte input), use
boolean indexing:

b = numpy.ones_like(a)
b[a > 200] = 0

which gives the same result as numpy.where(a > 200, 0, 1) but keeps the
output in a's 1-byte dtype, and so consumes much less memory.
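
A minimal self-contained check of the point above, using random data as a
stand-in for the image (array shrunk for illustration; shapes, seed, and
variable names are mine, not from the thread):

```python
import numpy as np

# Stand-in for the ~9000 x 11000 one-byte image (shrunk for illustration).
rng = np.random.default_rng(0)
a = rng.integers(0, 256, size=(90, 110), dtype=np.uint8)

# where() with two Python scalars promotes the result to a wider int dtype.
b_where = np.where(a > 200, 0, 1)

# Boolean-indexing version: allocate in a's 1-byte dtype, then flip the
# elements that meet the condition.
b = np.ones_like(a)   # uint8, one byte per element
b[a > 200] = 0

assert (b == b_where).all()          # same values...
print(b.dtype, b.itemsize)           # uint8 1
print(b_where.dtype.itemsize)        # 4 or 8 bytes per element, so 4-8x larger
```

On the full 9000 x 11000 image that is roughly 99 MB for the uint8 result
versus 400-800 MB for the where() result, before counting temporaries.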

-Travis
 

_______________________________________________
Numpy-discussion mailing list
Numpy-discussion@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/numpy-discussion
