On 16/03/2014 15:39, Eelco Hoogendoorn wrote:
> Note that I am not opposed to extra operators in python, and only mildly
> opposed to a matrix multiplication operator in numpy; but let me lay out
> the case against, for your consideration.
>
> First of all, the use of matrix semantics relative to array
> semantics is extremely rare; even in linear-algebra-heavy code, array
> semantics often dominate. As such, the default of array semantics for
> numpy has been a great choice. I've never looked back at MATLAB semantics.
>
> Secondly, I feel the urge to conform to a historical mathematical
> notation is misguided, especially for the problem domain of linear
> algebra. Perhaps in the world of mathematics your operation is
> associative or commutes, but on your computer, the order of operations
> will influence both outcomes and performance. Even for products, we
> usually care not only about the outcome, but also how that outcome is
> arrived at. And along the same lines, I don't suppose I need to explain
> how I feel about A@@-1 and the like. Sure, it isn't too hard to learn or
> infer that this implies a matrix inverse, but why on earth would I want to
> pretend the rich complexity of numerical matrix inversion can be mangled
> into one symbol? I'd much rather write inv or pinv, or whatever
> particular algorithm happens to be called for in a given situation.
> Considering this isn't the num-lisp discussion group, I suppose I am
> hardly the only one who feels this way.
>
> On the whole, I feel the @ operator is mostly superfluous. I prefer to
> be explicit about where I place my brackets. I prefer to be explicit
> about the data layout and axes that go into a (multi)linear product,
> rather than rely on obtuse row/column conventions which are not
> transparent across function calls. When I do linear algebra, it is
> almost always vectorized over additional axes; how does a special
> operator which is only well defined for a few special cases of 2d and 1d
> tensors help me with that?

Well, the PEP explains a well-defined logical interpretation for
cases >2d, using broadcasting. You can vectorize over additional axes.
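For concreteness, here is a minimal sketch of those >2d semantics, using np.matmul (which is what the @ operator maps to in NumPy, once Python 3.5+ makes the operator available): the last two axes are treated as matrices, and any leading axes broadcast as usual.

```python
import numpy as np

# A stack of 10 independent (2, 3) matrices times a stack of 10 (3, 4)
# matrices: matmul multiplies the trailing two axes pairwise and
# broadcasts over the leading "stack" axis.
a = np.random.rand(10, 2, 3)
b = np.random.rand(10, 3, 4)
c = np.matmul(a, b)      # shape (10, 2, 4)
print(c.shape)           # (10, 2, 4)

# Broadcasting also lets a single matrix apply across the whole stack:
m = np.random.rand(3, 3)
d = np.matmul(a, m)      # shape (10, 2, 3)
print(d.shape)           # (10, 2, 3)
```

So vectorizing a linear-algebra operation over additional axes does not require giving up the operator: the stack axes come along for free.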

> On the whole, the linear algebra conventions
> inspired by the particular constraints of people working
> with blackboards, are a rather ugly and hacky beast in my opinion, which
> I feel no inclination to emulate. As a sidenote to the contrary; I love
> using broadcasting semantics when writing papers. Sure, your reviewers
> will balk at it, but it wouldn't do to give the dinosaurs the last word
> on what any given formal language ought to be like. We get to define the
> future, and I'm not sure the set of conventions that goes under the name
> of 'matrix multiplication' is one of particular importance to the future
> of numerical linear algebra.
>
> Note that I don't think there is much harm in an @ operator; but I don't
> see myself using it either. Aside from making textbook examples like a
> Gram-Schmidt orthogonalization more compact to write, I don't see it
> having much of an impact in the real world.
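For what it's worth, even that textbook case reads noticeably better with an infix product. A minimal classical Gram-Schmidt sketch (assuming linearly independent columns; `@` here stands for the proposed operator, equivalent to np.dot for these shapes, where `q @ v` on 1-d arrays is the inner product):

```python
import numpy as np

def gram_schmidt(a):
    """Orthonormalize the columns of a (assumed linearly independent)."""
    q = np.zeros_like(a, dtype=float)
    for j in range(a.shape[1]):
        v = a[:, j].astype(float)
        for i in range(j):
            # Subtract the projection of v onto the already-built q_i.
            v -= (q[:, i] @ v) * q[:, i]
        q[:, j] = v / np.linalg.norm(v)
    return q

a = np.array([[1.0, 1.0],
              [0.0, 1.0]])
q = gram_schmidt(a)
print(np.round(q.T @ q, 10))   # columns are orthonormal: q.T @ q is the identity
```

Whether that compactness matters much outside textbooks is, of course, exactly the question being debated here.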
>
>
> On Sat, Mar 15, 2014 at 3:52 PM, Charles R Harris
> <[email protected] <mailto:[email protected]>> wrote:
>
>
>
>
>     On Fri, Mar 14, 2014 at 6:51 PM, Nathaniel Smith <[email protected]
>     <mailto:[email protected]>> wrote:
>
>         Well, that was fast. Guido says he'll accept the addition of '@'
>         as an
>         infix operator for matrix multiplication, once some details are
>         ironed
>         out:
>         https://mail.python.org/pipermail/python-ideas/2014-March/027109.html
>         http://legacy.python.org/dev/peps/pep-0465/
>
>         Specifically, we need to figure out whether we want to make an
>         argument for a matrix power operator ("@@"), and what
>         precedence/associativity we want '@' to have. I'll post two separate
>         threads to get feedback on those in an organized way -- this is
>         just a
>         heads-up.
>
>
>     Surprisingly little discussion on python-ideas, or so it seemed to
>     me. Guido came out in favor less than halfway through.
>     Congratulations on putting together a successful proposal, many of
>     us had given up on ever seeing a matrix multiplication operator.
>
>     Chuck
>
>
>     _______________________________________________
>     NumPy-Discussion mailing list
>     [email protected] <mailto:[email protected]>
>     http://mail.scipy.org/mailman/listinfo/numpy-discussion
>
>
>
>



