xrange() should be more memory-efficient than range(), not less: http://stackoverflow.com/questions/135041/should-you-always-favor-xrange-over-range
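A quick way to see the difference is to compare object sizes with sys.getsizeof. This sketch uses Python 3, where range() is already lazy like Python 2's xrange(); under Python 2 you would compare xrange(10 ** 6) against range(10 ** 6) instead:

```python
import sys

# A lazy range object stores only start/stop/step, regardless of length.
# (Python 3's range behaves like Python 2's xrange; Python 2's range
# would build the full list up front.)
lazy = range(10 ** 6)          # constant-size lazy object
eager = list(range(10 ** 6))   # fully materialized list of a million ints

print(sys.getsizeof(lazy))    # tens of bytes
print(sys.getsizeof(eager))   # millions of bytes
```

Note that getsizeof only counts the list's own pointer array, not the int objects it references, so the true footprint of the eager version is even larger.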
Replacing arrays with lists is probably a bad idea for a lot of reasons: you'll lose the vectorization of simple operations and all of NumPy's other benefits. To be more parsimonious with memory, pay close attention to array data types and make sure things aren't being silently upconverted to higher-precision types.

I've never considered using reset(). It may be worth looking at your program's structure to make sure that useless arrays can be garbage collected properly.

Preallocating arrays can give you big wins in both memory use and program speed. If you aren't using preallocation, now is a great time to start.

You can pass NumPy arrays into Cython functions, and you can also call numpy/scipy functions within Cython functions. Identify your bottlenecks using some kind of profiling, then work on optimizing those with Cython.

HTH,
Mike

On Thu, Sep 26, 2013 at 11:19 AM, Josè Luis Mietta <[email protected]> wrote:

> Hi experts!
>
> I want to use less RAM in my Monte Carlo simulations. In my algorithm I
> use numpy arrays and the xrange() function.
> I hear that I can reduce the RAM used by my algorithm if I do the
> following:
>
> 1) replace xrange() with range()
> 2) replace numpy arrays with Python lists
> 3) use the reset() function to delete useless arrays
>
> Is that true?
>
> In addition, I want to increase the execution speed of my code (I use
> numpy and SciPy functions). How can I apply Cython? Will it help?
>
> Please help.
>
> Thanks a lot!!
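The dtype and preallocation points above can be sketched as follows (array sizes and names here are made up for illustration, not taken from the original poster's code):

```python
import numpy as np

# Watch the dtype: combining a float32 array with a float64 array
# silently upconverts the result to float64, doubling its memory use.
a = np.zeros(1000, dtype=np.float32)
b = a + np.ones(1000)                    # float64 operand -> float64 result
c = a + np.ones(1000, dtype=np.float32)  # stays float32
print(b.dtype, c.dtype)                  # float64 float32

# Preallocate the result array once instead of growing it inside the
# Monte Carlo loop; each fill is an in-place write, no reallocation.
n_samples = 10000
results = np.empty(n_samples, dtype=np.float32)
for i in range(n_samples):
    results[i] = i * 0.5  # stand-in for one simulation step
```

Keeping everything in float32 here halves the memory of the result arrays relative to NumPy's default float64, at the cost of precision, so check that single precision is acceptable for your simulation before committing to it.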
_______________________________________________
NumPy-Discussion mailing list
[email protected]
http://mail.scipy.org/mailman/listinfo/numpy-discussion
