On 02-Feb-2000, Matt Harden <[EMAIL PROTECTED]> wrote:
>
> I like the clever code. Unfortunately, it's not just more complex, it's
> less efficient.
It is a little less efficient, but I'm not convinced that
the difference would be significant.
> You will consume at least 3 random numbers from `next`
> each time you need one new random number.
That statement is not correct.
For n > 1, the code that I posted needs only one random number
from `next' in the best case. For n = 1, i.e. if you ask it for
"random" numbers in the range [0, 1), then it will return an infinite
list of zeros ([0, 0 ..]) without calling `next' at all.
If the range of the random number generator is indeed exactly [-2^29,
2^29-1], then we already know the full range, and so no extra
calls to `next' will be needed; one call per number extracted
will suffice.
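For concreteness, here is a rough sketch of that known-range case (the
name `uniformBelow' is made up, the bounds are simply passed in as a
parameter, and this is not the code from my earlier post):

    import System.Random (RandomGen, next)

    -- Sketch of the known-range case: if we know up front that `next'
    -- yields values in [lo, hi] (for instance lo = -2^29, hi = 2^29-1),
    -- then one call per number is the norm; we only retry when the
    -- value lands in the small sliver at the top that would otherwise
    -- bias the result.
    uniformBelow :: RandomGen g => (Int, Int) -> Int -> g -> (Int, g)
    uniformBelow (lo, hi) n g0 = go g0
      where
        range = hi - lo + 1             -- 2^30 in the example above
        limit = (range `div` n) * n     -- largest multiple of n <= range
        go g = let (x, g') = next g
                   v       = x - lo     -- shift into [0, range)
               in if v < limit
                    then (v `mod` n, g')  -- accept
                    else go g'            -- reject, call `next' again

The retry branch is taken with probability (range mod n) / range, which
is always less than a half and is tiny when n is small compared with
2^30.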
If the range is larger, then some additional calls to `next'
will be needed. If the range is much larger than 2^30,
then at a rough guess (figuring out the exact average here
is a bit complicated) you will need on average about 3 calls
to `next' in total to extract the first random number in a
known range.
Also, if you are extracting a long sequence of random numbers
(e.g. via the `rands' function that I posted, or using the `randoms'
or `randomRs' class methods in the Haskell Library Report, which
could be implemented using the `rands' function I posted),
then I think the asymptotic number of calls to `next'
needed per random number extracted is just one, because
the range would be quickly established.
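To make the amortisation point concrete: if `rands' has a type along
the lines below (a guess at a reasonable signature, not necessarily the
one from my post), then a `randomRs'-style method falls out directly,
and every number in the list shares whatever range-establishment work
the first one did:

    import System.Random (RandomGen)

    -- Hypothetical stand-in for the `rands' function from my earlier
    -- post: an infinite list of values in [0, n), threading a single
    -- generator through the whole list.
    rands :: RandomGen g => Int -> g -> [Int]
    rands = error "stand-in for the function from my earlier post"

    -- A `randomRs'-style list for Int, expressed in terms of `rands':
    randomRsViaRands :: RandomGen g => (Int, Int) -> g -> [Int]
    randomRsViaRands (lo, hi) g = map (+ lo) (rands (hi - lo + 1) g)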
The code that I posted could be improved on;
e.g. if you're extracting numbers in the range [0, 1],
then you should be able to extract at least 30 of them
per call to `next'. However, that would only help
if you're extracting a sequence of numbers, e.g. via
`randoms' or `randomRs'; it wouldn't help with extracting
numbers one at a time via `random' or `randomR'.
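Something along these lines (a sketch only; the name `bits' is made up)
would do it for the sequence case:

    import Data.Bits (testBit)
    import System.Random (RandomGen, next)

    -- Sketch: 30 numbers in the range [0, 1] from each call to `next',
    -- assuming the generator delivers at least 30 usable bits per call.
    -- (This takes the low-order bits, subject to the caveat about bit
    -- quality discussed below.)
    bits :: RandomGen g => g -> [Int]
    bits g = [ if testBit x i then 1 else 0 | i <- [0 .. 29] ] ++ bits g'
      where
        (x, g') = next g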
> I also think it could/would distort the statistical characteristics of
> the RNG. I came up with a different hack that would have worked great
> if the RNG was perfect, but as it was, I found it really skewed the mean
> of output numbers.
That is possible, I suppose. If the RNG is perfect, then my
code will preserve that perfection, i.e. the numbers that it
returns will be perfectly uniformly distributed; but if the RNG
is imperfect, then it is possible that my code could magnify
those imperfections.
I don't know much about this topic. One thing I've heard is that some
low-quality RNGs tend to be much less random in the low-order bits.
One possible improvement to my code would be to use the high-order bits
returned from the RNG rather than the low-order bits. In particular,
in the definition of `solve'
> solve val max0 n g rands =
>     if max0 >= n then
>         let max0' = (max0 `div` n) * n in
>         if val < max0' then
>             (val `mod` n, g)
>         ...
instead of
    val `mod` n
it may be better to use
    val `div` (max0' `div` n)
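Spelled out as two standalone helpers (hypothetical names; val is
assumed to have already passed the test against max0' in `solve'):

    -- Both map a value already known to lie in [0, max0'), where max0'
    -- is a multiple of n, onto the range [0, n); the only difference is
    -- which bits of the generator's output the result depends on.
    lowBits, highBits :: Int -> Int -> Int -> Int
    lowBits  val _     n = val `mod` n                -- low-order bits
    highBits val max0' n = val `div` (max0' `div` n)  -- high-order bits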
> From that experience, I feel like your code would
> have some similar effects. For instance, initial ascending sequences
> from `next` get dropped by getRands:
>
> [1,3,7,10,21,2] -> [2]
>
> Same with descending sequences. That may not seem bad, but getRands is
> called each time the user program requests a new number (via randN or
> randsN). I think that means you'll see longer ascending or descending
> sequences much less commonly from randN than you do from next.
I don't think that conclusion follows.
Certainly if the RNG is truly random, it won't hold.
If the RNG is flawed, my method might perhaps amplify the flaw
in some way, but I think it would be just as likely to
increase the number of ascending or descending sequences
as to decrease it; exactly what the effect would be
would depend on the nature of the flaw in the RNG.
But I'm not a mathematician or statistician either,
and I don't know much about the kind of flaws that
commonly used RNGs have.
If you're worried about effects like that, then code along the lines
of what Tom Pledger posted, which assumes that the range of `next'
(i.e. the range of Int) is a power of 2, would be less likely to
amplify any imperfections in the RNG.
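I don't have Tom's code to hand, so the following is only my guess at
the general shape of that approach, with made-up names:

    import Data.Bits ((.&.))
    import System.Random (RandomGen, next)

    -- Rough sketch of the power-of-two style: assume `next' delivers at
    -- least 30 uniformly random bits per call.  Mask down to the
    -- smallest power of two that covers n and reject anything in
    -- [n, 2^k); no `div'/`mod' arithmetic is involved, so there would
    -- be less scope for amplifying any bias in the generator.
    powerOfTwoBelow :: RandomGen g => Int -> g -> (Int, g)
    powerOfTwoBelow n g0 = go g0
      where
        mask = head [ 2 ^ k - 1 | k <- [1 ..], 2 ^ k >= n ]
        go g = let (x, g') = next g
                   v       = x .&. mask
               in if v < n then (v, g') else go g'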
--
Fergus Henderson <[EMAIL PROTECTED]> | "I have always known that the pursuit
WWW: <http://www.cs.mu.oz.au/~fjh> | of excellence is a lethal habit"
PGP: finger [EMAIL PROTECTED] | -- the last words of T. S. Garp.