We generate pairs of properly distributed Gaussian variables at
intervals as short as 10 ns, which is essential in the application.
Speed can be an issue, particularly in real-time situations.
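
(For concreteness, here is a rough sketch of one standard pair-producing
transform, the Marsaglia polar method. The uniform source shown, the C
library's rand(), is only an illustrative stand-in, not our actual
generator, and this is not the timing-critical code we use.)

#include <math.h>
#include <stdlib.h>

/* Illustrative uniform on (-1, 1); a serious application would use a
   much better uniform generator than rand(). */
static double unif(void)
{
    return 2.0 * ((double)rand() / ((double)RAND_MAX + 1.0)) - 1.0;
}

/* Marsaglia polar method: converts pairs of uniforms into pairs of
   independent standard normal deviates, rejecting candidate points
   that fall outside the unit circle. */
void gauss_pair(double *z1, double *z2)
{
    double u, v, s;
    do {
        u = unif();
        v = unif();
        s = u * u + v * v;
    } while (s >= 1.0 || s == 0.0);
    s = sqrt(-2.0 * log(s) / s);
    *z1 = u * s;
    *z2 = v * s;
}

Because of the rejection step it consumes, on average, about 4/pi
(roughly 1.27) candidate pairs of uniforms per pair of gaussians.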

Ian

"Glen Barnett" <[EMAIL PROTECTED]> wrote in message
news:a4plof$p3s$[EMAIL PROTECTED]...
>
> Art Kendall <[EMAIL PROTECTED]> wrote in message
> news:[EMAIL PROTECTED]...
> > I tend to be more concerned with the "apparent randomness" of the
> > results than with the speed of the algorithm.
>
> This will be mainly a function of the randomness of the uniform
> generator. If we assume the same uniform generator for both, and
> assuming it's a pretty good one (our current one is reasonable, though
> I want to go back and update it soon), there shouldn't be a huge
> difference in the apparent randomness of the resulting gaussians.
>
> > As a thought experiment, what is the cumulative time difference in a
> > run using the fastest vs the slowest algorithm? A whole minute? A
> > second? A fractional second?
>
> When you need millions of them (as we do; a run of 10,000 simulations
> could need as many as 500 million gaussians, and we sometimes want to
> do more than 10,000), and you also want your program to be interactive
> (in the sense that the user doesn't have to wander off and have coffee
> just to do one simulation run), knowing that one algorithm is, say, 30%
> faster is kind of important. Particularly if the user may want to do
> hundreds of simulations...
>
> A whole minute extra on a simulation run is a big difference, if the
> user is doing simulations all day.
>
> Glen
>
>
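
(To put rough numbers on Glen's point, with purely illustrative figures:
if one gaussian costs, say, 100 ns to generate, then 500 million of them
take about 50 seconds of generation time alone; a method 30% faster saves
roughly 15 seconds per run, which comes to something like 25 minutes over
a hundred runs.)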




