> For dot product I can convince myself this is a math definition thing and
> accept the conjugation. But for "vecmat", why the complex conjugate of the
> vector? Are we assuming that 1D things are always columns? I am also a bit
> lost on the difference between dot, vdot and vecdot.
>
> Also, if __matmul__ and np.matmul give different results, I think you will
> enjoy many fun tickets. Personally I would agree with them no matter what
> the reasoning was at the time of divergence.

For vecmat, the logic is indeed that 1-D vectors are always treated as
columns, so one needs to transpose, and for complex input also conjugate.
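To make the "column vector" logic concrete, here is a small sketch of the
vecmat semantics described above, emulated with einsum so it runs on any
NumPy version (np.vecmat itself only exists in recent NumPy); the arrays are
just illustrative:

```python
import numpy as np

# Treating a 1-D x as a column vector means x @ A requires taking the
# conjugate transpose of x first, i.e. vecmat(x, A) == x.conj() @ A.
x = np.array([1 + 2j, 3 - 1j])
A = np.array([[1 + 0j, 2j],
              [3 - 1j, 4 + 0j]])

# Reference computation: conjugate of the vector times the matrix.
expected = x.conj() @ A

# Emulation of the vecmat semantics, conjugating the vector explicitly.
result = np.einsum('i,ij->j', x.conj(), A)

assert np.allclose(result, expected)
```

Note that the plain `@` operator does not conjugate, which is exactly where
the surprise comes from.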

On the differences between dot, vdot, and vecdot: dot is basically weird
(for higher-dimensional inputs it sums over the last axis of the first
argument and the second-to-last of the second), weird enough for matmul to
be added. vdot flattens both arrays before calculating x†y, conjugating the
first argument; vecdot takes the conjugated dot product over the last axis
(or an explicitly given one).
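A quick illustration of the distinction, with vecdot emulated via einsum so
the example also runs on NumPy versions that predate np.vecdot (the arrays
here are just examples):

```python
import numpy as np

a = np.array([[1 + 1j, 2j],
              [3 + 0j, 4 - 2j]])
b = np.ones((2, 2), dtype=complex)

# np.dot: matrix product for 2-D inputs, no conjugation.
d = np.dot(a, b)

# np.vdot: flattens both arguments, conjugates the first, and returns
# the scalar sum(a.conj() * b).
v = np.vdot(a, b)
assert np.isclose(v, (a.conj() * b).sum())

# vecdot semantics (np.vecdot in recent NumPy): conjugated dot product
# over the last axis, so here one result per row.
vd = np.einsum('...i,...i->...', a.conj(), b)
assert vd.shape == (2,)
assert np.isclose(vd.sum(), v)
```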

On matmul, indeed we should ensure __matmul__ and np.matmul give
identical results; what I meant was that np.matmul doesn't necessarily
have to do the same special-casing for vectors: it could deal with
matrices only, and let __matmul__ call vecdot, vecmat, matvec, or matmul
as appropriate.  Anyway, this is somewhat orthogonal to the discussion here.
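A hypothetical sketch of that dispatch, to make the idea concrete; this is
not NumPy's actual implementation, and it mirrors the current behaviour of
`@` (no conjugation), whereas the vecdot/vecmat functions would conjugate:

```python
import numpy as np

def matmul_dispatch(x, y):
    """Hypothetical __matmul__ dispatch by dimensionality."""
    if x.ndim == 1 and y.ndim == 1:
        # vector @ vector -> scalar ("vecdot", but @ does not conjugate)
        return np.einsum('i,i->', x, y)
    if x.ndim == 1:
        # vector @ matrix ("vecmat", again without the conjugation)
        return np.einsum('i,...ij->...j', x, y)
    if y.ndim == 1:
        # matrix @ vector ("matvec")
        return np.einsum('...ij,j->...i', x, y)
    # matrix @ matrix: plain matmul
    return np.matmul(x, y)
```

The point is only that the vector special cases could live in the operator
dispatch rather than inside np.matmul itself.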

-- Marten
_______________________________________________
NumPy-Discussion mailing list -- numpy-discussion@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/