Currently there are lots of ways to compute dot products (dot, vdot, inner, 
tensordot, einsum...), but none of them are really convenient for the case of 
arrays of vectors, where one dimension (usually the last or the first) is the 
vector dimension. The simplest way to do this today is `np.sum(a * b, 
axis=axis)`, but that makes vector algebra less readable without a wrapper 
function, and it is probably not as well optimized as the matrix products. 
Another way is to add appropriate length-1 dimensions and use matmul, but 
that is arguably less readable and not obvious to do generically for 
arbitrary axes. I think either np.dot or np.vdot could easily be extended 
with an `axis` parameter that turns it into a bulk vector operation, with the 
same semantics as `np.sum(a * b, axis=axis)`. It could also take a `keepdims` 
parameter (matching NumPy's other reductions), which is useful for keeping 
the result broadcastable against the inputs.
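
For concreteness, here is a rough sketch of the two current workarounds and 
of the proposed semantics written as a plain Python wrapper; the name 
`vecdot` and its signature are just placeholders for illustration, not an 
existing NumPy function:

    import numpy as np

    def vecdot(a, b, axis=-1, keepdims=False):
        # Proposed semantics, as a plain wrapper: sum of the elementwise
        # products along the vector axis, broadcasting over all other axes.
        return np.sum(np.asarray(a) * np.asarray(b), axis=axis, keepdims=keepdims)

    rng = np.random.default_rng(0)
    a = rng.standard_normal((10, 3))  # 10 vectors of length 3
    b = rng.standard_normal((10, 3))

    # Workaround 1: elementwise multiply and sum.
    d1 = np.sum(a * b, axis=-1)

    # Workaround 2: insert length-1 dimensions and use matmul, then drop
    # the extra dimensions again (written here for the last-axis case).
    d2 = (a[..., None, :] @ b[..., :, None])[..., 0, 0]

    assert np.allclose(d1, d2)
    assert np.allclose(vecdot(a, b), d1)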

I submitted a corresponding issue at https://github.com/numpy/numpy/issues/21915