On Thu, Mar 20, 2014 at 4:01 AM, Andrew Dalke da...@dalkescientific.com wrote:
My preference is for same-left. I rarely work with numpy, and it's more
likely that I'll see '@' used in a non-numpy context. That is, people
in general will see @ as a sort of free-for-all operator, to use and
On Mar 20, 2014, at 10:07 AM, Robert Kern wrote:
I think the operator-overload-as-DSL use cases actually argue somewhat
for right-associativity. ... Right-associativity adds some diversity
into the ecosystem and opens up some design space.
You say that like it's a good thing.
My argument is
On 03/19/2014 08:45 PM, josef.p...@gmail.com wrote:
On Wed, Mar 19, 2014 at 2:24 PM, Nathaniel Smith n...@pobox.com
mailto:n...@pobox.com wrote:
On Tue, Mar 18, 2014 at 9:14 AM, Robert Kern robert.k...@gmail.com
mailto:robert.k...@gmail.com wrote:
On Tue, Mar 18, 2014 at
On Thu, Mar 20, 2014 at 1:10 PM, Andrew Dalke da...@dalkescientific.com wrote:
On Mar 20, 2014, at 10:07 AM, Robert Kern wrote:
I think the operator-overload-as-DSL use cases actually argue somewhat
for right-associativity. ... Right-associativity adds some diversity
into the ecosystem and
On 03/20/2014 02:26 PM, Dag Sverre Seljebotn wrote:
On 03/19/2014 08:45 PM, josef.p...@gmail.com wrote:
On Wed, Mar 19, 2014 at 2:24 PM, Nathaniel Smith n...@pobox.com
mailto:n...@pobox.com wrote:
On Tue, Mar 18, 2014 at 9:14 AM, Robert Kern robert.k...@gmail.com
On Thu, Mar 20, 2014 at 1:36 PM, Dag Sverre Seljebotn
d.s.seljeb...@astro.uio.no wrote:
On 03/20/2014 02:26 PM, Dag Sverre Seljebotn wrote:
I'm positive about the chained @ idea; I think it's the answer to what we
really want.
Sorry, I totally misunderstood this. The question is of course how
On Thu, Mar 20, 2014 at 9:07 AM, Robert Kern robert.k...@gmail.com wrote:
I think the operator-overload-as-DSL use cases actually argue somewhat
for right-associativity. There is no lack of left-associative
operators for these use cases to choose from since they usually don't
have numeric or
On Wed, Mar 19, 2014 at 7:45 PM, Nathaniel Smith n...@pobox.com wrote:
Okay, I wrote a little script [1] to scan Python source files looking for
things like 'dot(a, dot(b, c))' or 'dot(dot(a, b), c)', or the ndarray.dot
method equivalents. So what we get out is:
- a count of how many 'dot' calls
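Nathaniel's actual script is only linked as [1], but the kind of scan he describes can be sketched with the standard `ast` module. The function name and the exact counting rules here are my own illustration, not necessarily what his script does:

```python
import ast

def count_dot_nesting(source):
    """Count dot(...) calls and how they nest: 'left' means
    dot(dot(a, b), c), 'right' means dot(a, dot(b, c)).
    Matches both the bare dot(...) and np.dot(...) / a.dot(...) forms."""
    def is_dot(node):
        f = node.func
        return (isinstance(f, ast.Name) and f.id == "dot") or \
               (isinstance(f, ast.Attribute) and f.attr == "dot")

    counts = {"total": 0, "left": 0, "right": 0}
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Call) and is_dot(node):
            counts["total"] += 1
            args = node.args
            if args and isinstance(args[0], ast.Call) and is_dot(args[0]):
                counts["left"] += 1
            if len(args) > 1 and isinstance(args[-1], ast.Call) and is_dot(args[-1]):
                counts["right"] += 1
    return counts
```

Running this over a corpus of source files and summing the three counters gives exactly the kind of left-nested vs right-nested tally discussed in the thread.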
On Thu, Mar 20, 2014 at 1:36 PM, Dag Sverre Seljebotn
d.s.seljeb...@astro.uio.no wrote:
On 03/20/2014 02:26 PM, Dag Sverre Seljebotn wrote:
Order-of-matrix-multiplication is literally my textbook example of a
dynamic programming problem with complexity O(n^2) where n is number of
terms (as in,
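The problem Dag is referring to is the classic matrix-chain-ordering dynamic program; in its textbook form it fills O(n^2) subproblems in O(n^3) time. A minimal sketch:

```python
def matrix_chain_cost(dims):
    """Minimum scalar multiplications to compute A1 @ A2 @ ... @ Ak,
    where Ai has shape (dims[i-1], dims[i]).  Classic interval DP:
    O(n^2) subproblems, O(n^3) time overall."""
    n = len(dims) - 1                      # number of matrices in the chain
    cost = [[0] * n for _ in range(n)]     # cost[i][j]: best cost for Ai..Aj
    for span in range(1, n):
        for i in range(n - span):
            j = i + span
            cost[i][j] = min(
                cost[i][k] + cost[k + 1][j] + dims[i] * dims[k + 1] * dims[j + 1]
                for k in range(i, j)
            )
    return cost[0][n - 1]
```

For the Mat @ Mat @ vec shape pattern, dims like [100, 100, 100, 1] make the optimizer pick the right-associated grouping, which is the performance point raised elsewhere in the thread.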
On Thu, Mar 20, 2014 at 1:25 PM, Nathaniel Smith n...@pobox.com wrote:
On Wed, Mar 19, 2014 at 7:45 PM, Nathaniel Smith n...@pobox.com wrote:
Okay, I wrote a little script [1] to scan Python source files looking for
things like 'dot(a, dot(b, c))' or 'dot(dot(a, b), c)', or the ndarray.dot
On Mar 20, 2014, at 3:02 PM, Nathaniel Smith wrote:
- And anyway, my impression is that python-dev will give these other
possible uses ~zero weight anyway -- if they thought random DSL
operators were important for their own sake, they would have added @
long ago :-).
Unlike what you all seem
On Thu, Mar 20, 2014 at 8:38 PM, Andrew Dalke da...@dalkescientific.com wrote:
You say we've been asked to report back on what design of @ will
be best for the numeric community, since that's where we have special
expertise that python-dev lacks. I don't really think that goal
means you can
On Thu, Mar 20, 2014 at 9:10 AM, Andrew Dalke da...@dalkescientific.com wrote:
In DSL space, that means @ could be used as the inverse of ** by those
who want to discard any ties to its use in numerics. Considering it
now, I agree this would indeed open up some design space.
I don't see
On Mar 20, 2014, at 10:39 PM, Robert Kern wrote:
Sure, but that discussion will (and should) happen on python-ideas.
When Nathaniel says that we have been asked to answer this very
specific question, he means that literally.
Ah, now I understand.
Thanks!
On Tue, Mar 18, 2014 at 9:14 AM, Robert Kern robert.k...@gmail.com wrote:
On Tue, Mar 18, 2014 at 12:54 AM, Nathaniel Smith n...@pobox.com wrote:
On Sat, Mar 15, 2014 at 6:28 PM, Nathaniel Smith n...@pobox.com wrote:
Mathematica: instead of having an associativity, a @ b @ c gets
converted
On Wed, Mar 19, 2014 at 2:24 PM, Nathaniel Smith n...@pobox.com wrote:
On Tue, Mar 18, 2014 at 9:14 AM, Robert Kern robert.k...@gmail.com
wrote:
On Tue, Mar 18, 2014 at 12:54 AM, Nathaniel Smith n...@pobox.com wrote:
On Sat, Mar 15, 2014 at 6:28 PM, Nathaniel Smith n...@pobox.com wrote:
On Sat, Mar 15, 2014 at 3:41 AM, Nathaniel Smith n...@pobox.com wrote:
I think we need to
know something about how often the Mat @ Mat @ vec type cases arise in
practice. How often do non-scalar * and np.dot show up in the same
expression? How often does it look like a * np.dot(b, c), and how
On Mar 15, 2014, at 4:41 AM, Nathaniel Smith wrote:
OPTION 1 FOR @: ... same-left
OPTION 2 FOR @: ... weak-right
OPTION 3 FOR @: ... tight-right
(In addition to more unusual forms, like 'grouping'.)
There's another option, which is to refuse the temptation to guess,
and not allow X @ Y @ Z
Perhaps this is a bit of a thread hijack; but this discussion got me thinking
about how to arrive
at a more vectorized/tensorified way of specifying linear algebra
operations, in an elegant manner.
I probably got a little carried away, but what about this syntax?
- indexing/calling an ndarray
Just to add one vote: I am for *right association*, because 1) I think of
matrix multiplication more like operators, which I also learned to apply
from right to left, and because 2) I would put a vector on the right,
which would result in better performance.
I don't have an opinion on
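The performance point in (2) can be made concrete with a rough multiply count. This is an illustrative sketch of my own; shapes are (rows, cols) pairs:

```python
def chain_cost(shapes, right_assoc=False):
    """Scalar-multiplication count for multiplying a chain of matrices
    strictly left-to-right or right-to-left, with no reordering.
    An (m, n) @ (n, p) product costs m*n*p multiplications."""
    total = 0
    if right_assoc:
        acc = shapes[-1]                       # fold from the right
        for m, n in reversed(shapes[:-1]):
            assert n == acc[0], "inner dimensions must match"
            total += m * n * acc[1]
            acc = (m, acc[1])
    else:
        acc = shapes[0]                        # fold from the left
        for n, p in shapes[1:]:
            assert acc[1] == n, "inner dimensions must match"
            total += acc[0] * n * p
            acc = (acc[0], p)
    return total

# Mat @ Mat @ vec with 1000x1000 matrices and a length-1000 column vector:
shapes = [(1000, 1000), (1000, 1000), (1000, 1)]
```

With the vector on the right, right association costs 2 million multiplications where left association costs about a billion: the Mat @ Mat product is never formed.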
On Tue, Mar 18, 2014 at 12:54 AM, Nathaniel Smith n...@pobox.com wrote:
On Sat, Mar 15, 2014 at 6:28 PM, Nathaniel Smith n...@pobox.com wrote:
Mathematica: instead of having an associativity, a @ b @ c gets
converted into mdot([a, b, c])
So, I've been thinking about this (thanks to @rfateman
To elaborate a little on such a more general and explicit method of
specifying linear operations (perhaps 'expressions with named axes' is a
good nomer to cover this topic).
I think indexing rather than calling is preferable. I worried at first
about the performance overhead of checking for
*About weak-left.* You need to define a priority for @, the matrix product,
relative to *, the elementwise product, because (A*B)@C != A*(B@C): see the
example above. I say that also from a mathematical point of view.
Using mathematics-like notations, Matrix1 * Matrix2 * 3 can be written
because
On Tue, Mar 18, 2014 at 3:22 PM, Christophe Bal projet...@gmail.com wrote:
About weak-left. You need to define a priority for @, the matrix product,
relative to *, the elementwise product, because (A*B)@C != A*(B@C): see the
example above. I say that also from a mathematical point of view.
What
Strange, Gmail has cut my example.
Here it is normally.
A = [1 2]
    [3 4]

B = [5 6]
    [7 8]

C = [a d]
    [b c]

(A*B)@C
=
[ 5 12]   [a d]
[21 32] @ [b c]
=
[ 5a+12b   5d+12c]
[21a+32b  21d+32c]

A*(B@C)
=
[1 2]   [5a+6b  5d+6c]
[3 4] * [7a+8b  7d+8c]
=
[  5a+6b  10d+12c]
[21a+24b  28d+32c]
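The same non-identity can be checked numerically. A small pure-Python sketch of my own, with sample numbers a=1, d=4, b=2, c=3 plugged into C:

```python
def ew_mul(X, Y):
    """Elementwise product: the proposed '*' on 2-D arrays."""
    return [[x * y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def matmul(X, Y):
    """Matrix product: the proposed '@'."""
    return [[sum(x * y for x, y in zip(rx, cy)) for cy in zip(*Y)]
            for rx in X]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
C = [[1, 4], [2, 3]]              # a=1, d=4, b=2, c=3

left  = matmul(ew_mul(A, B), C)   # (A*B)@C
right = ew_mul(A, matmul(B, C))   # A*(B@C)
# left != right, so * and @ do not associate with each other
```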
When I write "using mathematics-like notations...", Matrix1 * Matrix2 is a
matrix multiplication.
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion
On Tue, Mar 18, 2014 at 9:50 AM, Eelco Hoogendoorn
hoogendoorn.ee...@gmail.com wrote:
To elaborate a little on such a more general and explicit method of
specifying linear operations (perhaps 'expressions with named axes' is a
good nomer to cover this topic).
[...]
This is a good topic to
On Tue, Mar 18, 2014 at 3:22 PM, Christophe Bal projet...@gmail.com wrote:
About weak-left. You need to define a priority for @, the matrix product,
relative to *, the elementwise product, because (A*B)@C != A*(B@C)
This doesn't follow. (a / b) * c != a / (b * c), but / and * in
Python have the same
I'm still bothered by what Nathaniel mentioned about mixing 1d and 2d arrays
>>> c = np.arange(4)
>>> a = np.arange(16).reshape(4, 4)
>>> cc = c[:, None]
>>> a.dot(c).dot(c.T)
420
>>> a.dot(c.dot(c.T))
array([[  0,  14,  28,  42],
       [ 56,  70,  84,  98],
       [112, 126, 140, 154],
       [168, 182, 196, 210]])
This is a different situation because / is indeed a hidden multiplication:
a/b = a*inv(b). The same is true for + and -: a-b = a+opp(b). What I'm
saying is that these operations * and / are indeed of the very same kind.
This is not the same for * and @.
2014-03-18 17:53 GMT+01:00 Nathaniel
On 18 Mar 2014 17:32, Christophe Bal projet...@gmail.com wrote:
This is a different situation because / is indeed a hidden
multiplication: a/b = a*inv(b). The same is true for + and -:
a-b = a+opp(b). What I'm saying is that these operations * and / are indeed
of the very same kind.
This is
I think that there is a very big misunderstanding. My point of view is both
a mathematical and a programmatical one.
On 18 Mar 2014 20:20, Nathaniel Smith n...@pobox.com wrote:
On 18 Mar 2014 17:32, Christophe Bal projet...@gmail.com wrote:
This is a different situation because / is indeed
On Sat, Mar 15, 2014 at 7:01 PM, Alexander Belopolsky ndar...@mac.com wrote:
On Sat, Mar 15, 2014 at 2:25 PM, Alexander Belopolsky ndar...@mac.com
wrote:
On Fri, Mar 14, 2014 at 11:41 PM, Nathaniel Smith n...@pobox.com wrote:
Here's the main blocker for adding a matrix multiply operator '@'
On Mon, Mar 17, 2014 at 11:48 AM, Nathaniel Smith n...@pobox.com wrote:
One more question that I think should be answered by the PEP and may
influence the associativity decision is what happens if in an A @ B @ C
expression, each operand has its own type that defines __matmul__ and
On Mon, Mar 17, 2014 at 3:48 PM, Nathaniel Smith n...@pobox.com wrote:
On Sat, Mar 15, 2014 at 7:01 PM, Alexander Belopolsky ndar...@mac.com wrote:
One more question that I think should be answered by the PEP and may
influence the associativity decision is what happens if in an A @ B @ C
On Mon, Mar 17, 2014 at 4:09 PM, Alexander Belopolsky ndar...@mac.com wrote:
On Mon, Mar 17, 2014 at 11:48 AM, Nathaniel Smith n...@pobox.com wrote:
One more question that I think should be answered by the PEP and may
influence the associativity decision is what happens if in an A @ B @ C
On Mon, Mar 17, 2014 at 12:13 PM, Nathaniel Smith n...@pobox.com wrote:
In practice all
well-behaved classes have to make sure that they implement __special__
methods in such a way that all the different variations work, no
matter which class ends up actually handling the operation.
On Mon, Mar 17, 2014 at 12:50 PM, Alexander Belopolsky ndar...@mac.com wrote:
On Mon, Mar 17, 2014 at 12:13 PM, Nathaniel Smith n...@pobox.com wrote:
In practice all
well-behaved classes have to make sure that they implement __special__
methods in such a way that all the different variations
On Mon, Mar 17, 2014 at 1:18 PM, josef.p...@gmail.com wrote:
On Mon, Mar 17, 2014 at 12:50 PM, Alexander Belopolsky ndar...@mac.com wrote:
On Mon, Mar 17, 2014 at 12:13 PM, Nathaniel Smith n...@pobox.com wrote:
In practice all
well-behaved classes have to make sure that they implement
On Mon, Mar 17, 2014 at 2:55 PM, josef.p...@gmail.com wrote:
I'm again in favor of left, because it's the simplest to understand
A.dot(B).dot(C)
+1
Note that for many years to come the best option for repeated matrix
product will be A.dot(B).dot(C) ...
People who convert their
In article
CAPJVwBkLww7-ysZB76LMRZ+mmbyN_5T=ym_vu1pjgakrlbq...@mail.gmail.com,
Nathaniel Smith n...@pobox.com wrote:
OPTION 1 FOR @:
Precedence: same as *
Associativity: left
My shorthand name for it: same-left (yes, very creative)
This means that if you don't use parentheses, you get:
Hello,
and what about something like that ?
a @ b @ c -> (a @ b) @ c
a * b @ c -> (a * b) @ c
a @ b * c -> a @ (b * c)
Easy to remember. The *-product has priority over the @-product, and then we
just do the @-product from left to right.
An advantage of this is that parsers do their job from left to right
Sorry for all the misspellings...
2014-03-17 22:32 GMT+01:00 Christophe Bal projet...@gmail.com:
Hello,
and what about something like that ?
a @ b @ c -> (a @ b) @ c
a * b @ c -> (a * b) @ c
a @ b * c -> a @ (b * c)
Easy to remember. The *-product has priority over the @-product, and then
Here is the translation. ;-)
Hello,
and what about something like that ?
a @ b @ c -> (a @ b) @ c
a * b @ c -> (a * b) @ c
a @ b * c -> a @ (b * c)
Easy to remember: the *-product has priority over the @-product,
and we just do the @-product from left to right.
An advantage of
On Mon, Mar 17, 2014 at 9:38 PM, Christophe Bal projet...@gmail.com wrote:
Here is the translation. ;-)
Hello,
and what about something like that ?
a @ b @ c -> (a @ b) @ c
a * b @ c -> (a * b) @ c
a @ b * c -> a @ (b * c)
Easy to remember: the *-product has priority over the
I think that weak-left is a little strange, just think a little of the
operators used by mathematicians that always follow a hierarchy.
A parser is mostly done using grammars : see
http://docs.python.org/3.1/reference/grammar.html.
Defining *-product to have stronger priority than the @-product,
On Mon, Mar 17, 2014 at 6:33 PM, Christophe Bal projet...@gmail.com wrote:
Defining *-product to have stronger priority than the @-product, and this
last having stronger priority than +, will make the changes in the grammar
easier.
The easiest is to give @ the same precedence as *. This
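To make the grammar alternatives concrete, here is a toy precedence-climbing parser of my own (purely illustrative, not CPython's grammar machinery); changing the binding powers switches between "same precedence as *" and the weak-@ proposal:

```python
def parse(tokens, prec):
    """Minimal precedence-climbing parser for single-letter operands and
    the binary operators '*' and '@'.  `prec` maps operator -> binding
    power; both operators are treated as left-associative.  Returns a
    fully parenthesized string so the grouping is visible."""
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def expr(min_bp):
        nonlocal pos
        lhs = tokens[pos]; pos += 1
        while peek() in prec and prec[peek()] >= min_bp:
            op = tokens[pos]; pos += 1
            rhs = expr(prec[op] + 1)     # the +1 makes it left-associative
            lhs = "(%s %s %s)" % (lhs, op, rhs)
        return lhs

    return expr(0)

SAME = {"*": 1, "@": 1}     # @ at the same level as *
WEAK = {"*": 2, "@": 1}     # * binds tighter than @
```

Under SAME, `a @ b * c` groups as `((a @ b) * c)`; under WEAK it groups as `(a @ (b * c))`, which is exactly the difference being debated.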
I'm now convinced of the usefulness of @ and @@ too, but I also think that
you must think of other uses than only numpy. In other words, numpy is
a good argument for these new operators, but this can also open new
perspectives for other uses.
Speaking of `@@`, would the relative
First of all, I must be very tired, because I've written *I think that
weak-left is a little strange...* instead of *I think that same-left is a
little strange...*. It is night-time in France... ;-)
So I'm definitely for the weak-left!
Here is my answer to Alexander Belopolsky.
You are right
If you see the operators as following a hierarchy, the answer is simply yes.
2014-03-18 0:16 GMT+01:00 Bago mrb...@gmail.com:
I'm now convinced of the usefulness of @ and @@ too but I also think that
you must think of other uses than only for numpy. In other words, numpy is
a good
On Mon, Mar 17, 2014 at 11:16 PM, Bago mrb...@gmail.com wrote:
Speaking of `@@`, would the relative precedence of @ vs * be the same as @@
vs **?
This is one of the concerns that made Guido leery of @@ (but only one
of them). Since we seem to be dropping @@:
On Mon, Mar 17, 2014 at 10:33 PM, Christophe Bal projet...@gmail.com wrote:
I think that weak-left is a little strange, just think a little of the
operators used by mathematicians that always follow a hierarchy.
Not sure what you mean -- I don't think most mathematicians think that
scalar and
On Mon, Mar 17, 2014 at 6:33 PM, Christophe Bal projet...@gmail.com wrote:
I think that weak-left is a little strange, just think a little of the
operators used by mathematicians that always follow a hierarchy.
A parser is mostly done using grammars : see
This follows the principle that it's better to be great
at some things than to be mediocre at everything.
You're right.
I think that weak-left is a little strange, just think
a little of the operators used by mathematicians that
always follow a hierarchy.
Not sure what you mean -- I
On Tue, Mar 18, 2014 at 12:16 AM, Christophe Bal projet...@gmail.com wrote:
I think that weak-left is a little strange, just think
a little of the operators used by mathematicians that
always follow a hierarchy.
Not sure what you mean -- I don't think most mathematicians
think that scalar
On Sat, Mar 15, 2014 at 6:28 PM, Nathaniel Smith n...@pobox.com wrote:
Mathematica: instead of having an associativity, a @ b @ c gets
converted into mdot([a, b, c])
So, I've been thinking about this (thanks to @rfateman for pointing it
out), and wondering if Mathematica's approach is worth
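A literal-minded version of that conversion is just a flat fold; a smarter mdot could first run the matrix-chain DP to pick the cheapest grouping. A sketch on nested lists (names mine):

```python
from functools import reduce

def dot2(X, Y):
    """Plain two-argument matrix product on nested lists."""
    return [[sum(x * y for x, y in zip(rx, cy)) for cy in zip(*Y)]
            for rx in X]

def mdot(mats):
    """n-ary product, as if 'a @ b @ c' were collected into mdot([a, b, c]).
    Grouping doesn't change the value, only the cost, so this sketch just
    folds left; an optimizing mdot would reorder first."""
    return reduce(dot2, mats)
```

The appeal of the Mathematica approach is precisely that mdot sees the whole chain at once, so the associativity question disappears from the language and moves into the library.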
On Mon, Mar 17, 2014 at 8:37 PM, Russell E. Owen ro...@uw.edu wrote:
After seeing all the traffic on this thread, I am in favor of
same-left because it is easiest to remember:
- It introduces no new rules.
- It is unambiguous. If we pick option 2 or 3 we have no strong reason
to favor one
On Mon, Mar 17, 2014 at 8:54 PM, Nathaniel Smith n...@pobox.com wrote:
Currently Python has 3 different kinds of ops: left-associative (most
of them), right-associative (**), and chaining. Chaining is used for
comparison ops. Example:
a < b < c
gets parsed to something like
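Python evaluates a chained comparison with the middle operand computed once and short-circuit semantics; roughly:

```python
def chained_lt(a, b, c):
    """Rough desugaring of 'a < b < c': equivalent to 'a < b and b < c',
    except that in the real expression b is evaluated only once."""
    tmp = b
    return a < tmp and tmp < c
```

A chaining @ would analogously collect the operands rather than pick an associativity, which is what makes it attractive for the Mat @ Mat @ vec case.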
On Mar 17, 2014 5:54 PM, Nathaniel Smith n...@pobox.com wrote:
On Sat, Mar 15, 2014 at 6:28 PM, Nathaniel Smith n...@pobox.com wrote:
Mathematica: instead of having an associativity, a @ b @ c gets
converted into mdot([a, b, c])
So, I've been thinking about this (thanks to @rfateman for
On Mon, Mar 17, 2014 at 8:54 PM, Nathaniel Smith n...@pobox.com wrote:
But, this is actually a feature! Because obviously what *should* be
returned in this case is *not* (Mat @ vec) @ Mat, *or* Mat @ (vec @
Mat). Both of those answers are terrible; it's just, if you have an
ordinary
I tend to favor tight-right. The general scheme of precedence more or
less puts heavier operations higher than lighter operations (+ < *
< **) and @ is heavier than * in my mind. I think tight (either
-right or -left) has a good correspondence with current dot()
expressions, so it will make
I favor the weak right option.
1) Giving '*' higher precedence than `@` makes it easier, to my mind, to
parse out what is going to happen: all the element-wise multiplications,
followed by the matrix operations. I'd probably still use parenthesis for
clarity.
2) Right associative has the
On Sat, Mar 15, 2014 at 2:49 PM, Charles R Harris
charlesr.har...@gmail.com wrote:
I favor the weak right option.
1) Giving '*' higher precedence than `@` makes it easier, to my mind, to
parse out what is going to happen: all the element-wise multiplications,
followed by the matrix
On Sat, Mar 15, 2014 at 11:44 AM, Robert Kern robert.k...@gmail.com wrote:
I tend to favor tight-right. The general scheme of precedence more or
less puts heavier operations higher than lighter operations (+ < *
< **) and @ is heavier than * in my mind. I think tight (either
-right or -left) has
On Sat, Mar 15, 2014 at 9:58 AM, Robert Kern robert.k...@gmail.com wrote:
On Sat, Mar 15, 2014 at 2:49 PM, Charles R Harris
charlesr.har...@gmail.com wrote:
I favor the weak right option.
1) Giving '*' higher precedence than `@` makes it easier, to my mind, to
parse out what is going
Oops, make that '*' is *left* associative.
snip
Chuck
On Sat, Mar 15, 2014 at 4:40 PM, Charles R Harris
charlesr.har...@gmail.com wrote:
On Sat, Mar 15, 2014 at 9:58 AM, Robert Kern robert.k...@gmail.com wrote:
On Sat, Mar 15, 2014 at 2:49 PM, Charles R Harris
charlesr.har...@gmail.com wrote:
I favor the weak right option.
1) Giving '*'
On Fri, Mar 14, 2014 at 11:41 PM, Nathaniel Smith n...@pobox.com wrote:
Here's the main blocker for adding a matrix multiply operator '@' to
Python: we need to decide what we think its precedence and associativity
should be.
I am not ready to form my own opinion, but I hope the following
On Sat, Mar 15, 2014 at 3:41 AM, Nathaniel Smith n...@pobox.com wrote:
Hi all,
Here's the main blocker for adding a matrix multiply operator '@' to Python:
we need to decide what we think its precedence and associativity should be.
Another data point that might be useful:
Matlab: same-left
On Sat, Mar 15, 2014 at 1:28 PM, Nathaniel Smith n...@pobox.com wrote:
On Sat, Mar 15, 2014 at 3:41 AM, Nathaniel Smith n...@pobox.com wrote:
Hi all,
Here's the main blocker for adding a matrix multiply operator '@' to
Python:
we need to decide what we think its precedence and
Hi Chris,
On Sat, Mar 15, 2014 at 4:15 AM, Chris Laumann chris.laum...@gmail.com wrote:
Hi all,
Let me preface my two cents by saying that I think the best part of @ being
accepted is the potential for deprecating the matrix class — the syntactic
beauty of infix for matrix multiply is a nice
On Sat, Mar 15, 2014 at 6:33 PM, Joe Kington joferking...@gmail.com wrote:
On Sat, Mar 15, 2014 at 1:28 PM, Nathaniel Smith n...@pobox.com wrote:
On Sat, Mar 15, 2014 at 3:41 AM, Nathaniel Smith n...@pobox.com wrote:
Hi all,
Here's the main blocker for adding a matrix multiply operator
On Sat, Mar 15, 2014 at 2:25 PM, Alexander Belopolsky ndar...@mac.com wrote:
On Fri, Mar 14, 2014 at 11:41 PM, Nathaniel Smith n...@pobox.com wrote:
Here's the main blocker for adding a matrix multiply operator '@' to
Python: we need to decide what we think its precedence and associativity
On Sat, Mar 15, 2014 at 12:40 PM, Nathaniel Smith n...@pobox.com wrote:
On Sat, Mar 15, 2014 at 6:33 PM, Joe Kington joferking...@gmail.com
wrote:
On Sat, Mar 15, 2014 at 1:28 PM, Nathaniel Smith n...@pobox.com wrote:
On Sat, Mar 15, 2014 at 3:41 AM, Nathaniel Smith n...@pobox.com wrote:
On Sat, Mar 15, 2014 at 1:01 PM, Alexander Belopolsky ndar...@mac.com wrote:
On Sat, Mar 15, 2014 at 2:25 PM, Alexander Belopolsky ndar...@mac.com wrote:
On Fri, Mar 14, 2014 at 11:41 PM, Nathaniel Smith n...@pobox.com wrote:
Here's the main blocker for adding a matrix multiply operator '@'
On 15 Mar 2014 19:02, Charles R Harris charlesr.har...@gmail.com wrote:
Just to throw something new into the mix
u@v@w = u@(v@w) -- u@v is a dyadic matrix
u@v -- is a scalar
It would be nice if u@v@None, or some such, would evaluate as a dyad. Or
else we will still need the concept of row
On Sat, Mar 15, 2014 at 3:29 PM, Nathaniel Smith n...@pobox.com wrote:
It would be nice if u@v@None, or some such, would evaluate as a dyad.
Or else we will still need the concept of row and column 1-D matrices. I
still think v.T should set a flag so that one can distinguish u@v.T(dyad)
On Sat, Mar 15, 2014 at 1:29 PM, Nathaniel Smith n...@pobox.com wrote:
On 15 Mar 2014 19:02, Charles R Harris charlesr.har...@gmail.com
wrote:
Just to throw something new into the mix
u@v@w = u@(v@w) -- u@v is a dyadic matrix
u@v -- is a scalar
It would be nice if u@v@None, or
On Sat, Mar 15, 2014 at 4:00 PM, Charles R Harris charlesr.har...@gmail.com
wrote:
These days they are usually written as v*w.T, i.e., the outer product of
two vectors and are a fairly common occurrence in matrix expressions. For
instance, covariance matrices are defined as E(v * v.T)
With
On Sat, Mar 15, 2014 at 2:12 PM, Alexander Belopolsky ndar...@mac.com wrote:
On Sat, Mar 15, 2014 at 4:00 PM, Charles R Harris
charlesr.har...@gmail.com wrote:
These days they are usually written as v*w.T, i.e., the outer product of
two vectors and are a fairly common occurrence in matrix
On Fri, Mar 14, 2014 at 11:41 PM, Nathaniel Smith n...@pobox.com wrote:
Hi all,
Here's the main blocker for adding a matrix multiply operator '@' to
Python: we need to decide what we think its precedence and associativity
should be. I'll explain what that means so we're on the same page, and
On Sat, Mar 15, 2014 at 7:20 PM, josef.p...@gmail.com wrote:
On Fri, Mar 14, 2014 at 11:41 PM, Nathaniel Smith n...@pobox.com wrote:
Hi all,
Here's the main blocker for adding a matrix multiply operator '@' to
Python: we need to decide what we think its precedence and associativity
On Sat, Mar 15, 2014 at 11:30 PM, Charles R Harris
charlesr.har...@gmail.com wrote:
On Sat, Mar 15, 2014 at 7:20 PM, josef.p...@gmail.com wrote:
On Fri, Mar 14, 2014 at 11:41 PM, Nathaniel Smith n...@pobox.com wrote:
Hi all,
Here's the main blocker for adding a matrix multiply
On Sat, Mar 15, 2014 at 10:53 PM, josef.p...@gmail.com wrote:
On Sat, Mar 15, 2014 at 11:30 PM, Charles R Harris
charlesr.har...@gmail.com wrote:
On Sat, Mar 15, 2014 at 7:20 PM, josef.p...@gmail.com wrote:
On Fri, Mar 14, 2014 at 11:41 PM, Nathaniel Smith n...@pobox.com wrote:
Hi all,
Let me preface my two cents by saying that I think the best part of @ being
accepted is the potential for deprecating the matrix class — the syntactic
beauty of infix for matrix multiply is a nice side effect IMHO :) This may be
why my basic attitude is:
I don’t think it matters very
On Fri, Mar 14, 2014 at 9:15 PM, Chris Laumann chris.laum...@gmail.com wrote:
Hi all,
Let me preface my two cents by saying that I think the best part of @
being accepted is the potential for deprecating the matrix class -- the
syntactic beauty of infix for matrix multiply is a nice side