In a thread early last month I was doing my thing of stirring the pot
by making noise about the equivalence of 'information' and 'uncertainty'
- and I was quoting Shannon to back me up.
We all know that the two concepts are ultimately semantically opposed -
if for no other reason than that one names exactly what the other names
the absence of.
That is potentially fascinating. However, it is not terribly interesting to
state that we can establish a conservation principle merely by giving a name to
the absence of something, and then pointing out that if we start with a set
amount of that something, and take it away in chunks, then the chunks must
add up to the amount we started with.
Eric,
True enough. And yet, this is what Information Theory has decided to do:
treat the amount of _information_ that gets realized by performing an
experiment as the same as the amount of _uncertainty_ from which it was
liberated. That way, they can use entropy as the measure of both.
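To make the bookkeeping concrete, here is a minimal sketch in Python; the
entropy helper and the four-outcome example are mine, purely illustrative,
not anything drawn from Shannon himself.

import math

def entropy(probs):
    # H(X) = -sum(p * log2(p)): the uncertainty before the experiment,
    # and equally the average information the experiment will deliver.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Four equally likely outcomes: 2 bits of uncertainty up front.
prior = [0.25, 0.25, 0.25, 0.25]
print(entropy(prior))        # 2.0

# Learn one yes/no fact (which half the outcome lies in): 1 bit realized,
# and the two outcomes still in play hold the remaining bit.
posterior = [0.5, 0.5]
print(entropy(posterior))    # 1.0 -> 1 bit gained + 1 bit left = 2 bits

Whether you call that 2.0 uncertainty destroyed or information created is
exactly the semantic quarrel; the arithmetic is indifferent.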
I'm