My mind has just been blown ... I didn't realize you could do element-wise 
operations on types of different shape ... Do any other languages do this? 
I know that the dimension reduction was labelled APL-like ... how did APL 
deal with this kind of reduction operation? Off to the docs ... my mind ...
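
Since numpy comes up downthread: it broadcasts element-wise operations across
arrays of different shapes too, but with the reduced axis dropped by default.
A minimal sketch of the two cases discussed below (numpy's `keepdims=True` is
what reproduces the 1x5 shape that Julia 0.5's `sum(A, 1)` returns):

```python
import numpy as np

A = np.arange(25.0).reshape(5, 5)  # a 5x5 matrix, as in the thread

# The reduction drops the summed axis: shape (5,), not (1, 5).
col_sums = A.sum(axis=0)

# Broadcasting still lines the (5,) vector up against A's trailing axis,
# so column-wise normalization works without a reshape.
pnormalized = A / col_sums
print(pnormalized.shape)  # (5, 5); each column now sums to 1

# Slicing and reduction agree in shape, so this is a plain element-wise
# division of two length-5 vectors, not a 5x5 broadcasted result.
row_vs_mean = A[0] / A.mean(axis=0)
print(row_vs_mean.shape)  # (5,)

# keepdims=True retains the reduced axis, matching sum(A, 1) in Julia 0.5.
print(A.sum(axis=0, keepdims=True).shape)  # (1, 5)
```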

On Wednesday, May 25, 2016 at 7:03:03 PM UTC-7, Tim Holy wrote:
>
> Since you asked...from my perspective, the easiest argument against it is 
>
>     pnormalized = p ./ sum(p, 1) 
>
> If you drop the summed dimension, that won't work anymore. One can write 
>
>     pnormalized = p ./ reshape(sum(p, 1), 1, size(p, 2)) 
>
> but that's a little uglier than the converse, calling `squeeze` when you'd 
> rather get rid of the dimension. 
>
> That said, I recognize that your point has validity. I offer the example just 
> to make sure it's known. 
>
> Best, 
> --Tim 
>
> On Wednesday, May 25, 2016 06:49:53 PM a. kramer wrote: 
> > That's a solution, yes, but my feeling is that the "default" slicing and 
> > reduction behavior should play nice with one another.  In 0.5 it seems 
> > dropping dimensions with A[1,:] is default, which I do prefer.  It seems 
> > natural for sum(A,1) to be equivalent to A[1,:] + A[2,:] + A[3,:] + ... 
> > 
> > In my workflow (primarily data analysis), I often mix these slicing and 
> > reduction operations, but it would be unusual for a situation to appear 
> > where I would want one behavior for slicing and a different one for array 
> > reduction.  In situations where array dimensions correspond to, say, 
> > repeated observations, it is common to compare slices to means or maxima 
> > across dimensions.  However, I would be interested to hear arguments 
> > against. 
> > 
> > On Wednesday, May 25, 2016 at 9:05:04 PM UTC-4, Gabriel Gellner wrote: 
> > > Does it bother you to do the A[[1], :] to keep the old behavior? I haven't 
> > > thought enough about the role of sum, mean etc. 
> > > 
> > > On Wednesday, May 25, 2016 at 5:11:20 PM UTC-7, a. kramer wrote: 
> > >> Apologies in advance if this is something that has been discussed at 
> > >> length already, I wasn't able to find it. 
> > >> 
> > >> In Julia 0.5, if A is a 5x5 matrix, the behavior of A[1,:] will be 
> > >> changed to return a 5-element array instead of a 1x5 array.  However, at 
> > >> least in the current build, sum(A,1) still gives a 1x5 array as it does in 
> > >> earlier versions, and similarly for other array reductions like mean, 
> > >> maximum, etc. 
> > >> 
> > >> I understand that these functions and slicing are fundamentally different 
> > >> things, but I find this a little counterintuitive (at the very least 
> > >> different from numpy's behavior).  I often find myself interested in 
> > >> quantities such as A[1,:] ./ mean(A,1) (the first row of a matrix 
> > >> normalized by the average of each column's entries).  In 0.5, this gives 
> > >> something quite different from what I'm expecting (in fact it gives a 5x5 
> > >> matrix). 
> > >> 
> > >> So my questions are: Is there a discussion of the rationale behind doing 
> > >> things this way?  Is this something that may be changed in the future?  If 
> > >> not, is there an alternative to the standard sum, mean, etc. functions that 
> > >> is recommended for this?  Just a liberal use of squeeze()? 
>
>
