On Mo, 2014-07-28 at 15:35 +0200, Sturla Molden wrote:
> On 28/07/14 15:21, alex wrote:
> 
> > Are you sure they always give different results?  Notice that
> > np.ones((N,2)).mean(0)
> > np.ones((2,N)).mean(1)
> > compute means of different axes on transposed arrays so these
> > differences 'cancel out'.
> 
They will differ if different algorithms are used: np.ones((N,2)).mean(0) 
will have a larger accumulated rounding error than np.ones((2,N)).mean(1), 
if only the latter uses divide-and-conquer (pairwise) summation.
> 
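For reference, here is a minimal Python sketch of the divide-and-conquer
(pairwise) idea. This is not NumPy's actual implementation (which is in C
and uses a larger, unrolled base case), just an illustration of why the
rounding error shrinks:

```python
import math

def pairwise_sum(x):
    # Divide-and-conquer (pairwise) summation: split in half, sum each
    # half recursively, then add the two partial sums. Rounding error
    # grows roughly like O(log n) instead of O(n) for naive accumulation.
    if len(x) <= 8:  # small naive base case; NumPy's C code unrolls a bigger one
        return sum(x)
    mid = len(x) // 2
    return pairwise_sum(x[:mid]) + pairwise_sum(x[mid:])

# 1/3 is not exactly representable, so naive left-to-right accumulation
# drifts away from the correctly rounded result.
data = [1.0 / 3.0] * 100_000
naive = 0.0
for v in data:
    naive += v
exact = math.fsum(data)  # correctly rounded reference sum
print(abs(pairwise_sum(data) - exact), abs(naive - exact))
```

The printed errors show the pairwise result staying much closer to the
correctly rounded sum than the naive loop.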

What I wanted to point out is that to some extent the algorithm does not
matter. You will not necessarily get bit-identical results even if you
merely use a different iteration order, and we have been doing that for
years for speed reasons. All libraries like BLAS do the same.
Yes, the new changes make this much more dramatic, but they only make
some paths much better, never worse. It might be dangerous, but only in
the sense that you test with the good path and it works well enough, but
later (also) hit the other path in some lib. I am not even sure if I
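For concreteness: floating-point addition is not associative, so merely
regrouping the very same additions can change the last bits of the result:

```python
# Floating-point addition is not associative, so the grouping (and hence
# the iteration order) of identical terms can change the rounded result.
a = (0.1 + 0.2) + 0.3   # left-to-right accumulation
b = 0.1 + (0.2 + 0.3)   # same terms, different grouping
print(a)      # 0.6000000000000001
print(b)      # 0.6
print(a == b) # False
```

Neither result is "wrong"; both are correctly rounded partial sums, they
just round differently along the way.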

> I would suggest that in the first case we try to copy the array to a 
> temporary contiguous buffer and use the same divide-and-conquer 
> algorithm, unless some heuristics on memory usage fails.
> 

Sure, but you would have to make major changes to the buffered iterator
to do that without a larger speed penalty. It might be a good idea, but
it requires someone who knows this stuff to spend a lot of time and care
in the depths of numpy.
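At the Python level, the intent of that suggestion can be sketched by
forcing a contiguous copy before reducing, so the reduction runs along the
fast (last) axis. This is only an illustration of the idea; the actual
proposal concerns NumPy's C-level buffered iterator, not user code:

```python
import numpy as np

# Hypothetical user-level analogue of "copy to a contiguous buffer first":
# transpose, make the copy contiguous, then reduce along the last axis.
a = np.random.default_rng(0).random((100_000, 2))
m_direct = a.mean(0)                          # reduce along the slow axis
m_copied = np.ascontiguousarray(a.T).mean(1)  # copy, reduce the fast axis
print(np.allclose(m_direct, m_copied))
```

The two results agree to within rounding; the point of the proposal is that
the copied path can take the more accurate pairwise summation route.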

> Sturla
> 
> 
> 
> _______________________________________________
> NumPy-Discussion mailing list
> NumPy-Discussion@scipy.org
> http://mail.scipy.org/mailman/listinfo/numpy-discussion
> 

