I'm running operations on large arrays of floats, approximately 25,000 x 80. Python (scipy) does not seem to come close to using the 4 GB of wired memory, but dies with a MemoryError at around a gig. Everything works fine on smaller batches of data, around 10,000 x 80, using a maximum of ~600 MB of memory. Any ideas? Is this just too much data for scipy?
Thanks,
Conor

Traceback (most recent call last):
  File "C:\Temp\CR_2\run.py", line 68, in ?
    net.rProp(1.2, .5, .000001, 50.0, input, output, 1)
  File "/Users/conorrob/Desktop/CR_2/Network.py", line 230, in rProp
    print scipy.trace(error*scipy.transpose(error))
  File "D:\Python24\Lib\site-packages\numpy\core\defmatrix.py", line 149, in __mul__
    return N.dot(self, other)
MemoryError
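For what it's worth, a sketch of where the memory likely goes, assuming `error` in the traceback is a 25,000 x 80 matrix as described: `error * scipy.transpose(error)` is a matrix product whose result is 25,000 x 25,000 float64 values, about 4.7 GiB on its own, even though the trace of that product equals the much cheaper trace of the 80 x 80 product (or just the sum of squared entries):

```python
import numpy as np

rows, cols = 25000, 80  # dimensions assumed from the post
error = np.random.rand(rows, cols)

# error @ error.T would materialize a rows x rows float64 matrix:
full_bytes = rows * rows * 8
print(full_bytes / 2**30)  # roughly 4.66 GiB

# trace(A @ A.T) == trace(A.T @ A) == sum of squared entries,
# so the same scalar fits in an 80 x 80 intermediate, or none at all:
small = np.trace(error.T @ error)   # 80 x 80 intermediate
scalar = (error * error).sum()      # no matrix product needed
print(np.isclose(small, scalar))
```

This is only an illustration under those assumed dimensions; whether the 25,000 x 25,000 intermediate is the actual culprit depends on what `rProp` does with `error` elsewhere.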