Upon further investigation, I do believe the leak is within the scipy code. I commented out my call to processBinaryImage(), which consists entirely of scipy calls, and my memory usage remains flat with only about 1 MB of variation. Any ideas? Right now I am working around it by checking how far I got through my dataset, but I have to restart the program after each memory crash.
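
In case it helps anyone reproduce this, here is roughly how I am watching process memory around the suspect call. This is only a minimal sketch: the synthetic image stands in for the real rasters read via arcpy, and it uses the stdlib resource module, which is Unix-only (on Windows, psutil's Process().memory_info().rss would be the analogue).

    import resource
    import numpy as np

    def rss_kb():
        # Peak resident set size so far; on Linux ru_maxrss is in kilobytes.
        return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

    baseline = rss_kb()
    for i in range(1000):
        # Stand-in for one binary image from the dataset; the real loop
        # reads rasters via arcpy before handing them to scipy.
        img = np.random.rand(512, 512) > 0.5
        # processBinaryImage(img)   <- with this scipy-based call commented
        # out, the growth printed below stays within roughly 1 MB for me.
        if i % 100 == 0:
            print("iteration %d: rss grew by %d KB" % (i, rss_kb() - baseline))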


From: numpy-discussion-boun...@scipy.org On Behalf Of Joseph McGlinchy
Sent: Wednesday, January 29, 2014 11:17 AM
To: Discussion of Numerical Python
Subject: Re: [Numpy-discussion] Memory leak in numpy?

Perhaps it is an ESRI/Arcpy issue, then. I don't see anything in my code that could be doing that, though, as it is very minimal.

From: numpy-discussion-boun...@scipy.org On Behalf Of Benjamin Root
Sent: Wednesday, January 29, 2014 11:10 AM
To: Discussion of Numerical Python
Subject: Re: [Numpy-discussion] Memory leak in numpy?

Hmmm, I see no reason why that would eat up memory. I just tried it out on my own system (numpy 1.6.1, CentOS 6, Python 2.7.1) and had no issues; memory usage stayed flat for the 10 seconds it took to go through the loop. Note that I am not using ATLAS or BLAS, so maybe the issue lies there? (I don't know whether numpy defers the dot product to ATLAS or BLAS when they are available.)
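
The original script isn't quoted here, so the array shapes and iteration count below are assumptions, but a repeated-dot-product loop along these lines is the sort of thing that stayed flat for me:

    import numpy as np

    a = np.random.rand(500, 500)
    b = np.random.rand(500, 500)

    # Repeated dot products; watch the process in top (or similar) while this
    # runs -- resident memory should stay flat if nothing is leaking.
    for i in range(200):
        c = np.dot(a, b)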