On Wednesday, July 20, 2011, Carlos Becker <carlosbec...@gmail.com> wrote:
> Those are very interesting examples. I think that pre-allocation is very 
> important, and something similar happens in Matlab if no pre-allocation is 
> done: it takes 3-4x longer than with pre-allocation. The main difference is 
> that Matlab is able to take a pre-allocated array/matrix into account, 
> probably avoiding the creation of a temporary and writing the results 
> directly into the pre-allocated array.
>
> I think this is essential to speed up numpy. Maybe numexpr could handle this 
> in the future? Right now the general use of numexpr is result = 
> numexpr.evaluate("whatever"), so the same problem seems to be there.
>
> With this I am not saying that numpy is not worth it, just that for many 
> applications (especially with huge matrices/arrays), pre-allocation does make 
> a huge difference, particularly if we want to attract more people to using 
> numpy.

The ufuncs and many scipy functions take an "out" parameter where you
can specify a pre-allocated output array.  It can be a little awkward
writing expressions that way, but the capability is there.
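
For instance, here is a rough sketch of the idea (the array names and
sizes are just for illustration):

    import numpy as np

    a = np.random.rand(1000, 1000)
    b = np.random.rand(1000, 1000)

    # Allocate the output buffer once up front.
    result = np.empty_like(a)

    # The ufunc writes directly into "result", so no temporary is created.
    np.multiply(a, b, out=result)

    # A compound expression like result = a*b + a can reuse the same
    # buffer, at the cost of spelling it out step by step:
    np.multiply(a, b, out=result)
    np.add(result, a, out=result)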

But, ultimately, I think the main value of python and numpy is not
their speed, but rather the ease of use and how quickly one can develop
working code with them.  If you want to squeeze out every last bit of
CPU performance, you could program in assembly, but good luck getting
that linear solver done in time.

Don't get me wrong, there is always room for improvement, and I would
love to see numpy go even faster.  However, I doubt that winning over
Matlab users *requires* speed to be the main selling point.  Ease of
development and full-featured, high-quality standard and third-party
libraries have always been the top selling points for me.

Ben Root
