If you are going to start a new topic, please start a new thread as well, 
instead of replying to a message. Thanks.

Vikalpa Jetly wrote:
> I am reading a very large array (~9000 x 11000) of 1-byte image values. I need
> to change values in the array that meet a certain condition, so I am running
> something like:
> 
> b = numpy.where(a > 200, 0, 1)
> 
> to create a new array with the changed values. However, I get a
> "MemoryError" every time I try this. I have over 3 GB of RAM on my machine
> (most of which is available). The process runs fine on smaller datasets. Is
> there a maximum array size that numpy handles? Any alternatives/workarounds?

There is no predefined limit on the array size, just your memory.

However, note that what you are doing here creates an int32 (or int64 if you
are on a 64-bit machine) array, since you are passing Python integers to
where(). You can save quite a bit of memory by taking the boolean array from
the comparison and simply casting it to uint8. Note that where(a > 200, 0, 1)
puts 0 where the condition holds, so the matching comparison is (a <= 200).
That way, you only have two arrays in memory at any time, a and b.

   b = (a <= 200).astype(numpy.uint8)
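
As a small sketch of the memory difference (using a tiny hypothetical array in
place of the ~9000 x 11000 image; note that where(a > 200, 0, 1) yields 0 where
the condition is true, so the matching boolean comparison is a <= 200):

```python
import numpy as np

# Tiny stand-in for the ~9000 x 11000 uint8 image (values are hypothetical).
a = np.array([[100, 250],
              [201,  50]], dtype=np.uint8)

# Original approach: Python ints in where() produce a platform-integer
# array, costing 4 or 8 bytes per element.
b_where = np.where(a > 200, 0, 1)

# Boolean comparison cast to uint8: 1 byte per element, same values.
b = (a <= 200).astype(np.uint8)

assert (b == b_where).all()
print(b.itemsize, b_where.itemsize)  # 1 byte vs. 4 or 8 bytes per element
```

For the full-size image that is roughly a 100 MB result instead of 400-800 MB.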

-- 
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless enigma
  that is made terrible by our own mad attempt to interpret it as though it had
  an underlying truth."
   -- Umberto Eco


_______________________________________________
Numpy-discussion mailing list
Numpy-discussion@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/numpy-discussion