On 3/8/07, Robert Kern <[EMAIL PROTECTED]> wrote:

Daniel Mahler wrote:
> On 3/8/07, Charles R Harris <[EMAIL PROTECTED]> wrote:

>> Robert thought this might relate to Travis' changes adding broadcasting
>> to the random number generator. It does seem certain that generating
>> small arrays of random numbers has a very high overhead.
>
> Does that mean someone is working on fixing this?

It's not on the top of my list, no.

> Also what does 'adding broadcasting to the number generator' mean?

normal([[0.0], [0.5]], [1.0, 2.0, 3.0])

That gives you a (2, 3) array of random numbers drawn from 6 different
normal distributions:

  [[(mean=0, stdev=1),   (mean=0, stdev=2),   (mean=0, stdev=3)],
   [(mean=0.5, stdev=1), (mean=0.5, stdev=2), (mean=0.5, stdev=3)]]

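For concreteness, a minimal sketch of that call, assuming numpy.random.normal
broadcasts its mean/sigma arguments as described above:

import numpy as np

means = np.array([[0.0], [0.5]])    # shape (2, 1)
sigmas = np.array([1.0, 2.0, 3.0])  # shape (3,)

# The (2, 1) and (3,) shapes broadcast to a (2, 3) result, one draw
# per (mean, sigma) pair.
samples = np.random.normal(means, sigmas)
print(samples.shape)  # (2, 3)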

For normals this seems like overkill, since the same result can be achieved
with an offset and scale: if r is an array of random numbers with mean 0 and
sigma 1, then

myrandomarray = r*mysigma + mymean

gives the desired distribution. Other distributions don't have such happy
properties, unfortunately, and will have high overhead regardless. For
instance, the Poisson distribution requires computing new internal
parameters for each value of the mean, and doing that on an item-by-item
basis over a whole array is a terrible idea. Hmm, I am not convinced that
broadcasting is going to buy you much except overhead. Perhaps this problem
should be approached on a case-by-case basis rather than by some global
scheme.
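A minimal sketch of that offset-and-scale trick, using
numpy.random.standard_normal and illustrative parameter values (mymean and
mysigma here are just example names):

import numpy as np

mymean, mysigma = 0.5, 2.0             # illustrative parameters
r = np.random.standard_normal(1000)    # mean 0, sigma 1
myrandomarray = r*mysigma + mymean     # now mean 0.5, sigma 2.0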

Chuck