Chris Nash writes:

> I'd be interested to hear from anyone who constructs such a statistical
> deviation vs logarithm base plot. We may expect such a statistical approach
> to suggest a distribution where the overall scaling, and artifacts such as
> Noll's islands, manifest themselves in the plot as large deviations from
> randomness and spikes in the plot. This is one for the statisticians, to
> create a suitable measure of the deviation of these fractional parts from a
> uniform distribution on [0,1). Perhaps the sample variance will be a good
> first measure, but with only 37 samples and a high degree of
> non-independence, beware!

Sadly, the statistical inferences that can be drawn indicate no 
evidence of any deviation from a theoretical "smooth" exponential 
decay curve. There is a message in the archive on this very point 
(search for "Noll island").

Studies of related large primes, e.g. 3*2^n+1 and 5*2^n+1, exhibit 
similar distributions, though they do "look less lumpy" to the naked 
eye. (The top limit is around 300,000 rather than 3 million.)

The point is that random events *do* tend to occur in clusters. As 
an example, here in Northern Ireland we have already had more 
accidental deaths in house fires this year than we had in the whole 
of 1998, or in the whole of 1997. Politicians may panic, calling for 
compulsory fitting of smoke detectors, etc., but in fact there is no 
evidence that this is anything other than a run of "bad luck".
Similarly, I can find no statistically convincing evidence, even at the 
5% level, that the "Noll islands" really do exist.
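As a concrete illustration of the kind of test involved (a sketch only, using simulated data rather than the actual fractional parts, which I don't reproduce here), a one-sample Kolmogorov-Smirnov statistic against the uniform distribution on [0,1) can be computed directly, and compared to the approximate 5%-level critical value 1.36/sqrt(n):

```python
import math
import random

def ks_uniform(samples):
    """One-sample Kolmogorov-Smirnov statistic against Uniform[0,1)."""
    xs = sorted(samples)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        # the empirical CDF steps from i/n up to (i+1)/n at each x
        d = max(d, abs((i + 1) / n - x), abs(x - i / n))
    return d

# 37 samples, as in the quoted post; at the 5% level the critical
# value is approximately 1.36 / sqrt(n)
random.seed(1)
fracs = [random.random() for _ in range(37)]
d = ks_uniform(fracs)
crit = 1.36 / math.sqrt(len(fracs))
print(d, crit)
```

If the computed statistic stays below the critical value, the uniform hypothesis is not rejected -- which is exactly the situation with the "Noll islands" data. Bear in mind the quoted caveat: with only 37 non-independent samples, any such test has little power.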

(The rest of this reply is off-topic. Stop reading now if you object)

> (It may be apocryphal, but apparently some 8-bit machine (perhaps Atari?)
> had a means of generating "random" numbers because some memory location was
> subject to "noise" - effectively some component acted as a radio antenna. It
> may even have been by design... but of course results obtained by sampling
> this location for random bits were awful. Being natural they were not only
> non-uniform and non-independent but also subject to their surroundings. Can
> anyone validate this?).

I've never seen a system with a built-in hardware RNG; however, I 
do know that no less an authority than von Neumann suggested 
that this was a worthwhile feature to build into the architecture. Of 
course it has to be properly designed to be of any value. I believe 
von Neumann suggested shot noise from a thermionic valve as a 
suitable source; few computers incorporate such elements 
nowadays, but thermal noise from a high-value resistor would do 
equally well. Or, for that matter, acoustic noise from a 
microphone... the point is that you want to amplify the signal way 
up (distortion, extra noise, interference etc. introduced by this is 
not a problem!) and then take only a few of the least significant bits 
output by the ADC.
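The bit-extraction step might look something like the following sketch (the 10-bit ADC width and the 2-bit cut are illustrative assumptions, and Python's own generator stands in for the amplified analogue noise):

```python
import random

def lsb_bits(samples, k=2):
    """Keep only the k least significant bits of each ADC sample,
    emitted LSB first."""
    bits = []
    for s in samples:
        for b in range(k):
            bits.append((s >> b) & 1)
    return bits

def pack_bytes(bits):
    """Pack a bit list (LSB first) into bytes."""
    out = bytearray()
    for i in range(0, len(bits) - 7, 8):
        byte = 0
        for j in range(8):
            byte |= bits[i + j] << j
        out.append(byte)
    return bytes(out)

# stand-in for 64 readings from a 10-bit ADC sampling amplified noise
random.seed(0)
adc = [random.randrange(1024) for _ in range(64)]
rnd = pack_bytes(lsb_bits(adc, k=2))
print(len(rnd))  # 64 samples * 2 bits = 128 bits = 16 bytes
```

Discarding the high-order bits is the point: they carry the (slowly varying, correlated) signal level, while the bottom bits are dominated by the noise you actually want.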

You *do* need to check the output of many samples taken from 
such a hardware RNG using a variety of statistical tests before you 
can trust it, and you *do* need to test each completed hardware 
RNG individually.

There may also be a need to non-linearly transform values output 
from the RNG if you need a smooth, flat distribution of 
random values to feed into your application. (Especially if the RNG 
is based on time intervals between shot noise / radioactive decay 
type events.)
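For the decay-interval case, the standard trick is the probability integral transform: inter-event times from a Poisson-type process are exponentially distributed, so applying the exponential CDF F(t) = 1 - exp(-rate*t) maps them onto a uniform distribution on [0,1). A minimal sketch, using simulated intervals with an assumed known rate:

```python
import math
import random

def intervals_to_uniform(intervals, rate):
    """Apply the exponential CDF F(t) = 1 - exp(-rate * t) to each
    inter-event time; if the intervals really are exponential with
    that rate, the results are uniform on [0,1)."""
    return [1.0 - math.exp(-rate * t) for t in intervals]

# simulate decay-like events with a known mean interval of 1/rate
random.seed(2)
rate = 5.0
ts = [random.expovariate(rate) for _ in range(1000)]
u = intervals_to_uniform(ts, rate)
print(min(u) >= 0.0 and max(u) < 1.0)
```

In a real device the rate would have to be estimated from the source itself, and any error in that estimate shows up as residual non-uniformity -- another reason each completed unit needs individual testing.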

Nevertheless, done properly, such a technique for generating 
random numbers is *far* superior to the pseudo-random number 
generator functions in standard programming languages.

Regards
Brian Beesley
________________________________________________________________
Unsubscribe & list info -- http://www.scruz.net/~luke/signup.htm