Eric,

True enough. And yet, this is what Information Theory has decided to do: treat the amount of _information_ that gets realized by performing an experiment as the same as the amount of _uncertainty_ from which it was "liberated". That way, they can use entropy as the measure of both.

I'm personally sympathetic to an argument that they are not equivalent. My predilection suggests that there is more value in the uncertainty that exists before the experiment than there is in the information that results afterwards. I would expect there would be others who would put more value on the "liberated" information.

But I would have to put a lot more thought than I have into formalizing this.

I like your observation. It opens up the possibility of re-doing Information Theory, and ending up with one measure for uncertainty and another for information. And we could finally depose the word "entropy"!

Grant

On 7/20/11 3:18 PM, ERIC P. CHARLES wrote:
That is potentially fascinating. However, it is not terribly interesting to state that we can establish a conservation principle merely by giving a name to the absence of something, and then pointing out that if we start with a set amount of that something, and take it away in chunks, then the amount that is there plus the amount that is gone always equals the amount we started with. What is the additional insight?

Eric

On Wed, Jul 20, 2011 04:27 PM, Grant Holland <grant.holland...@gmail.com> wrote:

    In a thread early last month I was doing my thing of "stirring the
    pot" by making noise about the equivalence of 'information' and
    'uncertainty' - and I was quoting Shannon to back me up.

    We all know that the two concepts are ultimately semantically
    opposed - if for no other reason than uncertainty adds to
    confusion and information can help to clear it up. So,
    understandably, Owen - and I think also Frank - objected somewhat
    to my equating them. But I was able to overwhelm the thread with
    more Shannon quotes, so the thread kinda tapered off.

    What we all were looking for, I believe, is for Information Theory
    to back up our common usage and support the notion that
    information and uncertainty are, in some sense, semantically
    opposite; while at the same time they are both measured by the
    same function: Shannon's version of entropy (which is also Gibbs'
    formula, up to a choice of constants).

    Of course, Shannon does equate information and uncertainty - at
    least mathematically so, if not semantically so. Within the span
    of three sentences in his famous 1948 paper, he uses the words
    "information", "uncertainty" and "choice" to describe what his
    concept of entropy measures. But he never does get into any
    semantic distinctions among the three - only that all three are
    measured by /entropy/.

    Even contemporary information theorists like Vlatko Vedral,
    Professor of Quantum Information Science at Oxford, appear to be
    of no help with any distinction between 'information' and
    'uncertainty'. In his 2010 book _Decoding Reality: The Universe as
    Quantum Information_, he traces the notion of information back to
    the ancient Greeks.

        "The ancient Greeks laid the foundation for its
        [information's] development when they suggested that the
        information content of an event somehow depends only on how
        probable this event really is. Philosophers like Aristotle
        reasoned that the more surprised we are by an event the more
        information the event carries....

        Following this logic, we conclude that information has to be
        inversely proportional to probability, i. e. events with
        smaller probability carry more information...."

    But a simple inverse-proportional formula like I(E) = 1/Pr(E), where
    E is an event, does not suffice as a measure of
    'uncertainty/information', because it is not additive over
    independent events: for independent E and F, 1/Pr(E and F) =
    (1/Pr(E)) * (1/Pr(F)), a product rather than a sum. (We really like
    additivity in our measuring functions.) The formula needs to be
    tweaked to give us that.
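
    A quick numeric check of that point, in illustrative Python (the
    probabilities are invented, and "surprisal" below is just shorthand
    for log(1/Pr)):

    import math

    p_e, p_f = 0.5, 0.25        # two independent events E and F (made up)
    p_both = p_e * p_f          # independence: Pr(E and F) = Pr(E) * Pr(F)

    # The raw measure 1/Pr multiplies rather than adds:
    naive = lambda p: 1 / p
    print(naive(p_both), naive(p_e) + naive(p_f))              # 8.0 vs 6.0

    # Taking the log restores additivity:
    surprisal = lambda p: math.log2(1 / p)
    print(surprisal(p_both), surprisal(p_e) + surprisal(p_f))  # 3.0 vs 3.0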

    Vedral does the tweaking for additivity and gives us the formula
    used by information theorists to measure the amount of
    'uncertainty/information' in a single event. The formula is
    I(E) = log(1/Pr(E)). (Any base will do.) It is interesting that if
    this function is treated as a random variable, then its first
    moment (expected value) is Shannon's formula for entropy.
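
    For concreteness, a small sketch of that last remark, assuming a
    finite distribution (the distribution below is invented): the
    expected value of the surprisal I = log2(1/p), taken over the
    distribution itself, comes out equal to Shannon's H = -sum p log2 p.

    import math

    probs = [0.5, 0.25, 0.125, 0.125]        # an invented distribution

    # Surprisal of each outcome: I = log2(1/p)
    surprisals = [math.log2(1 / p) for p in probs]

    # First moment (expectation) of the surprisal under the distribution
    expected_surprisal = sum(p * i for p, i in zip(probs, surprisals))

    # Shannon's entropy formula: H = -sum(p * log2(p))
    entropy = -sum(p * math.log2(p) for p in probs)

    print(expected_surprisal, entropy)       # both 1.75 bits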

    But it was the Russian probability theorist A. I. Khinchin who
    provided us with the satisfaction we seek. Seeing that the Shannon
    paper (bless his soul) lacked both mathematical rigor and
    satisfying semantic justifications, he set out to put the situation
    right with his slim but essential volume entitled _Mathematical
    Foundations of Information Theory_ (1957). He
    manages to make the pertinent distinction between 'information'
    and 'uncertainty' most cleanly in this single passage. (By
    "scheme" Khinchin means "probability distribution".)

        "Thus we can say that the information given us by carrying out
        some experiment consists of removing the uncertainty which
        existed before the experiment. The larger this uncertainty,
        the larger we consider to be the amount of information
        obtained by removing it. Since we agreed to measure the
        uncertainty of a finite scheme A by its entropy, H(A), it is
        natural to express the amount of information given by removing
        this uncertainty by an increasing function of the quantity
        H(A)....

        Thus, in all that follows, we can consider the amount of
        information given by the realization of a finite scheme
        [probability distribution] to be equal to the entropy of the
        scheme."

    So, when an experiment is "realized" (the coin is flipped or the
    die is rolled), the uncertainty inherent in it "becomes"
    information. And there seems to be a /conservation principle/
    here. The amount of "stuff" inherent in the /uncertainty/ prior to
    realization is conserved after realization when it becomes
    /information/.
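
    A minimal sketch of that bookkeeping, reading the "conservation" as:
    entropy before realization equals information delivered by
    realization, with the residual uncertainty dropping to zero once the
    outcome is known (the fair die is just an example):

    import math

    # Uncertainty of a fair six-sided die before it is rolled
    h_before = math.log2(6)      # entropy of the uniform scheme, ~2.585 bits

    # After the roll the outcome is known with probability 1: no uncertainty left
    h_after = 0.0

    # On Khinchin's reading, the information gained is the uncertainty removed
    info_gained = h_before - h_after

    print(h_before, info_gained) # the ~2.585 bits of uncertainty "become" information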

    Fun.

    Grant

    On 6/6/11 8:17 AM, Owen Densmore wrote:

        Nick: Next time you are in town, let's read the original Shannon paper together.
        Alas, it is a bit long, but I'm told it's a Good Thing To Do.

                -- Owen

        On Jun 6, 2011, at 7:44 AM, Nicholas Thompson wrote:


            Grant,

            This seems backwards to me, but I got properly thrashed for my last few postings so I am putting my hat over the wall very carefully here.

            I thought… I thought… the information in a message was the number of bits by which the arrival of the message decreased the uncertainty of the receiver. So, let's say you are sitting awaiting the result of a coin toss, and I am on the other end of the line flipping the coin. Before I say “heads” you have 1 bit of uncertainty; afterwards, you have none.

            The reason I am particularly nervous about saying this is that it, of course, holds out the possibility of negative information. Some forms of communication, appeasement gestures in animals, for instance, have the effect of increasing the range of behaviors likely to occur in the receiver. This would seem to correspond to a negative value for the information calculation.
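
            A made-up numeric illustration of that worry, assuming the
            receiver's "uncertainty" is the entropy of a distribution over
            its likely behaviors before and after the gesture (both
            distributions are invented):

            import math

            def entropy(probs):
                """Shannon entropy in bits of a discrete distribution."""
                return -sum(p * math.log2(p) for p in probs if p > 0)

            before = [0.9, 0.05, 0.05]   # fairly predictable before the gesture
            after = [0.4, 0.3, 0.3]      # the gesture broadens the likely behaviors

            # If "information received" = decrease in uncertainty, this is negative
            info = entropy(before) - entropy(after)
            print(entropy(before), entropy(after), info)   # ~0.57, ~1.57, ~-1.00 bits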

            Nick
            From: friam-boun...@redfish.com [mailto:friam-boun...@redfish.com] On Behalf Of Grant Holland
            Sent: Sunday, June 05, 2011 11:07 PM
            To: The Friday Morning Applied Complexity Coffee Group; Steve Smith
            Subject: Re: [FRIAM] Quote of the week

            Interesting note on "information" and "uncertainty"...

            Information is Uncertainty. The two words are synonyms.

            Shannon called it "uncertainty"; contemporary information theory calls it "information".

            It is often thought that the more information there is, the less uncertainty. The opposite is the case.

            In Information Theory (aka the mathematical theory of communication), the degree of information I(E) - or uncertainty U(E) - of an event is measurable as an inverse function of its probability, as follows:

            U(E) = I(E) = log( 1/Pr(E) ) = log(1) - log( Pr(E) ) = -log( Pr(E) ).

            Considering I(E) as a random variable, Shannon's entropy is, in fact, the first moment (or expectation) of I(E). Shannon entropy = E[ I(E) ].

            Grant

            On 6/5/2011 2:20 PM, Steve Smith wrote:


            "Philosophy is to physics as pornography is to sex. It's cheaper, 
it's easier
            and some people seem to prefer it."

            Modern Physics is  contained in Realism which is contained in 
Metaphysics which
            I contained in all of Philosophy.

            I'd be tempted to counter:
            "Physics is to Philosophy as the Missionary Position is to the Kama 
Sutra"

            Physics also appeals to Phenomenology and Logic (the branch of Philosophy where Mathematics is rooted), and what we can know scientifically is constrained by Epistemology (the nature of knowledge) and phenomenology (the nature of conscious experience).

            It might be fair to say that many (including many of us here) who hold Physics up in some exalted position simply dismiss or choose to ignore all the messy questions considered by *the rest of* philosophy. Even if we think we have clear/simple answers to the questions, I do not accept that the questions are not worthy of the asking.

            The underlying point of the referenced podcast is, in fact, that Physics, or Science in general, might be rather myopic and limited by its own viewpoint by definition.

              "The more we know, the less we understand."

            Philosophy is about understanding; physics is about knowledge first and understanding only insomuch as it is a part of natural philosophy.

            Or at least this is how my understanding is structured around these matters.

            - Steve

            On Sun, Jun 5, 2011 at 1:15 PM, Robert Holmes <rob...@holmesacosta.com> wrote:
            > From the BBC's science podcast "The Infinite Monkey Cage":

            "Philosophy is to physics as pornography is to sex. It's cheaper, 
it's easier
            and some people seem to prefer it."

            Not to be pedantic, but I suspect that s/he has conflated "philosophy" with "new age", as much of science owes itself to philosophy.

            marcos

Eric Charles

Professional Student and
Assistant Professor of Psychology
Penn State University
Altoona, PA 16601


============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org
