On 15 Jan 2003, Lowell Gilbert wrote:
> "Wilkinson,Alex" <[EMAIL PROTECTED]> writes:
>
> > Can someone recommend to me where I can read up on
> > entropy.
> >
> > ie what it is ? Why we have it ? etc etc
>
> The term "entropy" is often used (in rough analogy to its technical
> meaning in thermodynamics) in computer systems to describe the
> "amount" of "randomness" available to random-number functionality.

Entropy is the mean information per symbol: for an alphabet of n symbols
occurring with probabilities p_1, ..., p_n, it is
H = -sum(p_i * log2(p_i)) bits per symbol. The higher the entropy, the
more random the source is.
It can be shown that the entropy is maximal when all the symbols in
the alphabet have the same probability 1/n, where n is the number of
symbols in the alphabet. In that case, the entropy has a value of
log2(n), and the source is truly random.
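As a small illustration of the formula above (a sketch in Python, not part
of the original mail; the helper name entropy() is mine), a uniform
distribution over n symbols hits the maximum log2(n), while a skewed one
falls below it:

```python
import math

def entropy(probs):
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p)).

    Terms with p == 0 contribute nothing, since p*log2(p) -> 0 as p -> 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform over 4 symbols: maximum entropy, log2(4) = 2.0 bits per symbol.
uniform = entropy([0.25, 0.25, 0.25, 0.25])

# A heavily biased two-symbol source carries well under 1 bit per symbol.
biased = entropy([0.9, 0.1])

print(uniform)  # 2.0
print(biased)   # about 0.469
```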
>
> What to read depends on why you need to know about it, but you could
> always refer to some manual pages, particularly rndcontrol(8).
If you want a more solid background, get an introductory book on
information/coding theory. But don't attempt one without a grounding in
calculus, probability, and algebra.
Fer
To Unsubscribe: send mail to [EMAIL PROTECTED]
with "unsubscribe freebsd-questions" in the body of the message