On 25 October 2012 22:54, David Warde-Farley <warde...@iro.umontreal.ca> wrote:
On Wed, Oct 24, 2012 at 7:18 AM, George Nurser <gnur...@gmail.com> wrote:
Hi,
I was just looking at the einsum function.
To me, it's a really elegant and clear way of doing array operations, which is the core of what numpy is about.
It removes the need to remember a range of functions, some of which I find tricky (e.g. tile).
Unfortunately the present implementation seems ~4-6x slower than dot or tensordot for decent-size arrays.
I suspect it is because the implementation does not use BLAS/LAPACK calls.
cheers, George Nurser.
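[Editor's note: the equivalence George is relying on can be sketched as below. This is a minimal illustration, not code from the thread; array shapes are arbitrary, and the einsum subscript strings are the standard ones for a matrix product.]

```python
import numpy as np

# A matrix product written three ways: the BLAS-backed np.dot,
# tensordot (contracting axis 1 of A with axis 0 of B), and einsum
# with an explicit index-sum specification.
A = np.random.rand(200, 300)
B = np.random.rand(300, 400)

r_dot = np.dot(A, B)
r_tensordot = np.tensordot(A, B, axes=([1], [0]))
r_einsum = np.einsum('ij,jk->ik', A, B)

# All three agree to floating-point tolerance; only the einsum path
# avoids the BLAS gemm call, which is where the slowdown comes from.
assert np.allclose(r_dot, r_tensordot)
assert np.allclose(r_dot, r_einsum)
```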
Hi George,
IIRC (and I haven't dug into it heavily; not a physicist, so I don't encounter this notation often), einsum implements a superset of what dot or tensordot (and the corresponding BLAS calls) can do. So I think logic is needed to carve out the special cases in which an einsum can be performed quickly with BLAS.
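[Editor's note: the "carve out special cases" idea can be sketched as a tiny dispatcher. `einsum_or_blas` is a hypothetical name for illustration, not a NumPy function; a real implementation would need to normalize subscripts and handle many more patterns.]

```python
import numpy as np

def einsum_or_blas(subscripts, a, b):
    # Hypothetical two-operand dispatcher: route the one spec we
    # recognise as a plain matrix product to the BLAS-backed np.dot,
    # and fall back to the general np.einsum machinery otherwise.
    if subscripts == 'ij,jk->ik':
        return np.dot(a, b)
    return np.einsum(subscripts, a, b)

a = np.random.rand(50, 60)
b = np.random.rand(60, 40)
c = einsum_or_blas('ij,jk->ik', a, b)      # fast path via BLAS

x = np.random.rand(50, 60)
y = np.random.rand(50, 60)
rowdots = einsum_or_blas('ij,ij->i', x, y)  # no BLAS equivalent; einsum path
```

The point of the thread is that NumPy itself would do this kind of pattern matching internally, so users keep the uniform einsum notation while common contractions still hit BLAS speed.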
Hi David,
Yes, that's my reading of the situation as well.
Pull requests in this vein would certainly be welcome, but they require the attention of someone who really understands how einsum works/can work.
...and I guess how to interface with BLAS/LAPACK.
cheers, George.
David
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion