On Thu, Mar 17, 2016 at 4:41 PM, Stephan Hoyer <sho...@gmail.com> wrote:
> On Thu, Mar 17, 2016 at 1:04 AM, Travis Oliphant <tra...@continuum.io>
> wrote:
>
>> I think that is a good idea. Let the user decide if scalar broadcasting
>> is acceptable for their function.
>>
>> Here is a simple concrete example where scalar broadcasting makes sense:
>>
>> A 1-d dot product (the core of np.inner): (k),(k) -> ()
>>
>> A user would assume they could call this function with a scalar in
>> either argument and have it broadcast to a 1-d array. Of course, if both
>> arguments are scalars, then it doesn't make sense.
>>
>> Having a way for the user to allow scalar broadcasting seems sensible
>> and a nice compromise.
>>
>> -Travis
>
> To generalize a little bit, consider the entire family of weighted
> statistical functions (mean, std, median, etc.). For example, the gufunc
> version of np.average is basically equivalent to np.inner with a bit of
> preprocessing.
>
> Arguably, it *could* make sense to broadcast weights when given a scalar:
> np.average(values, weights=1.0 / len(values)) is pretty unambiguous.
>
> That said, adding an explicit "scalar broadcasting OK" flag seems like a
> hack that will need even more special logic (e.g., so we can error if both
> arguments to np.inner are scalars).
>
> Multiple dispatch for gufunc core signatures seems like the cleaner
> solution. If you want np.inner to handle scalars, you need to supply core
> signatures (k),()->() and (),(k)->() along with (k),(k)->(). This is
> similar to the vision of three core signatures for np.matmul:
> (i),(i,j)->(j), (i,j),(j)->(i) and (i,j),(j,k)->(i,k).
>
> Maybe someone will even eventually get around to adding an axis/axes
> argument so we can specify these core dimensions explicitly. Writing
> np.inner(a, b, axes=((-1,), ())) could trigger the (k),()->() signature
> even if the second argument is not a scalar (it should be broadcast
> against "a" instead).

That's a great idea!

Adding multiple-dispatch capability for this case could also solve a lot of
issues that right now prevent generalized ufuncs from being the
implementation mechanism for *all* NumPy functions.

-Travis

--
*Travis Oliphant, PhD*
*Co-founder and CEO*

@teoliphant
512-222-5440
http://www.continuum.io
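To make the (k),(k)->() example above concrete, here is a minimal sketch
that uses np.vectorize's signature keyword (available in recent NumPy) as a
stand-in for a real C-level gufunc; the _inner1d/inner1d names are invented
for illustration. It shows the behavior under discussion: loop dimensions
broadcast normally, but a scalar argument is rejected because it cannot
supply the core dimension k.

    import numpy as np

    def _inner1d(a, b):
        # core operation on 1-d blocks of the inputs
        return (a * b).sum()

    # gufunc-style wrapper with core signature (k),(k)->()
    inner1d = np.vectorize(_inner1d, signature='(k),(k)->()')

    a = np.random.rand(3, 4)
    b = np.random.rand(4)

    inner1d(a, b).shape   # (3,) -- the loop dimension broadcasts as usual
    # inner1d(a, 2.0)     # raises ValueError: a 0-d argument cannot supply
    #                     # the core dimension k

Under the multiple-dispatch idea, additionally registering (k),()->() and
(),(k)->() is what would make a call like inner1d(a, 2.0) legal, while a
call with two scalars would still match no signature and remain an error.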
_______________________________________________
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
https://mail.scipy.org/mailman/listinfo/numpy-discussion