Sorry, the following email should be completely rewritten as a query rather
than an assertion, but I think I am too tired to do it coherently, and I am
hoping that an answer to my question may clarify the conversation. 

Hmmm....
This email from Grant, the next one, and much of the past discussion leads me
to believe that much of the confusion involves shifts in levels of analysis.
Surely, degree of organization is not just correlated with degree of
predictability, but the one is the other, quantified in a different manner (and
different measures have different properties desirable for different purposes).
All the examples of situations that "vary orthogonally" seem to involve a shift
of interest in which the narrator (here Grant, but elsewhere others) points out
that something about a system is organized in one sense, but unpredictable in
another. I have yet to detect a principled reason why it couldn't be called 
predictable in the first sense, but disorganized in the other. 

For example, if I tell you that three married couples just entered the room,
you would know that there are now six more people in the room, three men and
three women (as couples are typically ORGANIZED in that manner). It is not
terribly interesting (to me at least) for you to then point out that the color
of their eyes is difficult to PREDICT. Note, I could just as easily say that
you had a high probability of being correct if you PREDICTED they were three
men and three women, and that the eye color was still mysterious because there
is no clear ORGANIZATION of married vs. unmarried couples based on eye color. 

As a general style of approaching a problem, my bias would be to assert that
there was some underlying phenomenon of interest that has been quantified in
different ways depending on the interests of those doing the quantifying. 

So I guess my question is: on what basis does one declare one inquiry a
problem of determining level of organization and another a problem of
determining predictability? 

Eric



On Sat, Aug  7, 2010 04:25 PM, Grant Holland <[email protected]> wrote:
>
>Nick,
>
>
>Lemme try to present three examples of these two orthogonal dimensions
>(Organization/Disorganization dimension vs
>Predictability/Unpredictability dimension).
>
>It all boils down to what phenomena one chooses to be interested in.
>
>(Even if both dimensions are arguably present in a particular
>phenomenon, one can choose to observationally ignore one of them in the
>analysis of that phenomenon.)
>
>
>The first example will be exclusively interested in the
>Organization/Disorganization dimension.
>
>The second example will be exclusively interested in the
>Predictability/Unpredictability dimension.
>
>The third example will be jointly interested in both.
>
>
>Example I: An interest in Organization/Disorganization (structure or
>lack thereof), with no interest in Predictability/Unpredictability.
>
>Let's say we are interested in observing Hydrogen and Oxygen atoms
>within a small region of space.
>
>These atoms are capable of combining into several possible bonding
>configurations: O2, HO, H2O, etc.
>
>Suppose we observe this region for some finite time and make a list of
>any of these configurations that are observed.
>
>This list captures an interest in Organization/Disorganization, but not
>in Predictability/Unpredictability.
>
>We could go further and even develop a metric for the "degree of
>organization" observed. That would be a continuation of the same
>interest.
>
>
>Example II: An interest in Unpredictability/Predictability, with no
>interest in Organization/Disorganization.
>
>Let's say we are interested, as in Example I above, in observing
>Hydrogen and Oxygen atoms within a small region of space.
>
>However, this time, we have no interest in whether these atoms occur in
>a bonded or unbonded form.
>
>What we are emphasizing instead this time is the probability of
>selecting each of these two atom types at random from the region -
>
>AND whether or not the resulting probability distribution describes
>an Unpredictable situation, a Predictable situation, or somewhere in
>between.
>
>(Assume that we will use the best experimental practices to arrive at
>an estimate of the population parameters from sample statistics.)
>
>Let's say that we conclude that the distribution results in a .75 prob
>for H and .25 prob for O. (We "throw back" other atoms.)
>
>Then, we can conclude that the Shannon entropy for this distribution is
>-[(.75)*(log2(.75)) + (.25)*(log2(.25))] ≈ .811
>
>So, an interest in Unpredictability/Predictability can be measured by
>Shannon entropy.
>
>(The subject of "degree of dissipation" or of disorganization never
>arises here.)
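>
>(As a concrete check on that arithmetic, here is a minimal Python
>sketch; the shannon_entropy helper is just a direct transcription of
>the formula above, not part of any library.)
>
>    import math
>
>    def shannon_entropy(probs):
>        # H = -sum(p * log2(p)), over outcomes with nonzero probability
>        return -sum(p * math.log2(p) for p in probs if p > 0)
>
>    # Example II: P(H) = .75, P(O) = .25
>    print(shannon_entropy([0.75, 0.25]))  # ~0.811 bits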
>
>
>Example III: A compound interest in both dimensions (Organization X
>Predictability) jointly.
>
>Let's go back to Example I above, where we are interested in the
>various ways that H and O can bond.
>
>Suppose that we take that interest a little further and ask... 
>
>"What is the probability distribution of the observed molecules and
>ions involving H and O?"
>
>Now, we have combined our interest in the "organizations" of H and O
>with the relative probabilities of their occurrences.
>
>Thus, our probability (sample) space now, by definition, has the
>following possible outcomes: O2, HO, H2O, etc.
>
>And, each has its observed probability, and thus we have a joint
>probability distribution that we can apply Shannon's entropy against.
>
>Depending on the probabilities of each of these "H-O compounds", the
>Shannon entropy may be high or it may be low.
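>
>(Sketching Example III the same way, with purely hypothetical
>probabilities for the configurations - these numbers are illustrative,
>not measurements:)
>
>    import math
>
>    def shannon_entropy(probs):
>        return -sum(p * math.log2(p) for p in probs if p > 0)
>
>    # Hypothetical distribution over observed H-O configurations
>    config_probs = {"H2O": 0.70, "HO": 0.05, "O2": 0.10, "H2": 0.15}
>    print(shannon_entropy(config_probs.values()))  # ~1.32 bits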
>
>
>HTH,
>
>Grant
>
>
>
>
>
>
>Nicholas Thompson wrote:
>
>Grant – 
>
>Glad you are on board, here.  I will read this carefully.  
>
>Does this have anything to do with the Realism/Idealism
>thing?  Predictability requires a person to be predicting; organization
>is there even if there is no one there to predict one part from
>another. 
>
>N 
>
>From: [email protected] [mailto:[email protected]] On Behalf Of Grant Holland
>Sent: Saturday, August 07, 2010 2:06 PM
>To: [email protected]; The Friday Morning Applied Complexity Coffee Group
>Subject: Re: [FRIAM] entropy and uncertainty, REDUX 
>
>Russ - Yes.
>
>
>I use the terms "organizational" and "predictable", rather than
>"structural" and "behavioral", because of my particular interests. They
>amount to the same ideas. Basically they are two orthogonal dimensions
>of certain state spaces as they change.
>
>
>I lament the fact that the same term "entropy" is used to apply to both
>meanings, however. Especially since few realize that these two meanings
>are being conflated with the same word. Von Foerster actually defined
>the word "entropy" in two different places within the same book of
>essays to mean each of these two meanings! Often the word "disorder" is
>used. And people don't know whether "disorder" refers to
>"disorganization" or whether it refers to "unpredictability". This word
>has fostered the further unfortunate confusion. 
>
>
>It seems few people make the distinction that you have. This conflation
>causes no end of confusion. I really wish there were 2 distinct terms.
>In my work, I have come up with the acronym "DOUPBT" for the
>"unpredictable" meaning of entropy. (Or, "behavioral", as you call it.)
>It stands for Degree Of UnPredictaBiliTy. I actually use Shannon's
>formula for this meaning.
>
>
>This all came about because 1) Clausius invented the term entropy to
>mean "dissipation" (a kind of dis-organization, in my terms). 2) But
>then Gibbs came along and started measuring the degree of
>unpredictability involved in knowing the "arrangements" (positions and
>momenta) of molecules in an ideal gas. The linguistic problem was that
>Gibbs (and Boltzmann) used the same term - entropy - as had Clausius,
>even though Clausius emphasized a structural (dissipation) idea,
>whereas Gibbs emphasized an unpredictability idea (admittedly,
>unpredictability of "structural" change).
>
>
>To confuse things even more, Shannon came along and defined entropy in
>purely probabilistic terms - as a direct measure of unpredictability.
>So, historically, the term went from a purely structural meaning, to a
>mixture of structure and unpredictability to a pure unpredictability
>meaning. No wonder everyone is confused.
>
>
>Another matter is that Clausius, Boltzmann and Gibbs were all doing
>Physics. But Shannon was doing Mathematics. 
>
>
>My theory is Mathematics. I'm not doing Physics. So I strictly need
>Shannon's meaning. My "social problem" is that every time I say
>"entropy", too many people assume I'm talking about "dissipation" when
>I am not. I'm always talking about "unpredictability" when I use the
>term in my work. So, I have gone to using the phrase "Shannon's
>entropy", and never the word in its naked form. (Admittedly, I
>eventually also combine the two in a way similar to Gibbs :-[ .
>But I do not refer to the combined result as "entropy".)
>
>:-P 
>
>Grant
>
>
>
>Russ Abbott wrote: 


>
>Is it fair to say that Grant is talking about what one might call
>structural vs. behavioral entropy?
>
>
>Let's say I have a number of bits in a row. That has very low
>structural entropy. It takes very few bits to describe that row of
>bits. But let's say each is hooked up to a random signal. So
>behaviorally the whole thing has high entropy. But the behavioral
>uncertainty of the bits is based on the assumed randomness of the
>signal generator. So it isn't really the bits themselves that have high
>behavioral entropy. They are just a "window" through which we are
>observing the high entropy randomness behind them.  
>
>
>This is a very contrived example. Is it at all useful for a discussion
>of structural entropy vs. behavioral entropy? I'm asking that in all
>seriousness; I don't have a good sense of how to think about this.
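>
>(One rough way to make the structural/behavioral contrast concrete,
>sketched in Python: approximating the "structural" side by compressed
>description length and the "behavioral" side by the entropy of the
>driving source. Both choices are illustrative, not canonical
>definitions.)
>
>    import zlib
>
>    row = b"1" * 1000                     # a fixed, highly regular row of bits
>    structural = len(zlib.compress(row))  # far fewer than 1000 bytes: low structural entropy
>
>    behavioral = 1000 * 1.0               # 1000 independent fair bits: 1 bit each
>    print(structural, behavioral)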
>
>
>This suggests another thought. A system may have high entropy in one
>dimension and low entropy in another. Then what? Most of us are very
>close to the ground most of the time. But we don't stay in one place in
>that relatively 2-dimensional world. This sounds a bit like Nick's
>example. If you know that an animal is female, you can predict more
>about how she will act than if you don't know that. 
>
>
>One other thought: Nick talked about gradients and the tendency for them
>to dissipate.  Is that really so? If you put two mutually insoluble
>liquids in a bottle, one heavier than the other, the result will be a
>layer cake of liquids with a very sharp gradient between them. Will
>that ever dissipate?
>
>
>What I think is more to the point is that potential energy gradients
>will dissipate. Nature abhors a potential energy gradient -- but not
>all gradients. 


>
>-- Russ 


>
>On Thu, Aug 5, 2010 at 11:09 AM, Grant Holland <[email protected]> wrote:
>
>Glen is very close to interpreting what I mean to say. Thanks,
>Glen!
>
>
>(But of course, I have to try one more time, since I've thought of
>another - hopefully more compact - way to approach it...)
>
>
>Logically speaking, "degree of unpredictability" and "degree of
>disorganization" are orthogonal concepts and ought to be able to vary
>independently - at least in certain domains. If one were to develop a
>theory about them (and I am), then that theory should provide for them
>to be able to vary independently. 
>
>
>Of course, for some "applications" of that theory, these
>"predictability/unpredictability" and "organization/disorganization"
>variables may be dependent on each other. For example, in
>Thermodynamics, it may be that the degree of unpredictability and the
>degree of disorganization are correlated. (This is how many people seem
>to interpret the second law.) But this is specific to a Physics
>application.
>
>
>However, in other applications, it could be that the degree of
>uncertainty and the degree of disorganization vary independently. For
>example, I'm developing a mathematical theory of living and lifelike systems.
>Sometimes in that domain there is a high degree of predictability that
>an organo-chemical entity is organized, and sometimes there is
>unpredictability around that. The same statement goes for
>predictability or unpredictability around disorganization. Thus, in
>the world of living systems, unpredictability and disorganization
>can vary independently. 
>
>
>To make matters more interesting, these two variables can be joined in
>a joint space. For example, in the "living systems example" we could
>ask about the probability of advancing from a certain disorganized
>state in one moment to a certain organized state in the next moment. In
>fact, we could look at the entire probability distribution of advancing
>from this certain disorganized state at this moment to all possible
>states at the next moment - some of which are more disorganized than
>others. But if we ask this question, then we are asking about a
>probability distribution over states that have varying degrees of
>organization associated with them. And since we now have a probability
>distribution involved, we can ask "what is its Shannon
>entropy?" That is, what is its degree of unpredictability? So we have
>created a joint space that asks about both disorganization and
>unpredictability at the same time. This is what I do in my theory
>("Organic Complex Systems").
>
>
>Statistical Thermodynamics (statistical mechanics) also mixes these two
>orthogonal variables in a similar way. This is another way of looking
>at what Gibbs (and Boltzmann) contributed. Especially Gibbs talks about
>the probability distributions of various "arrangements" (organizations)
>of molecules in an ideal gas (these arrangements, states, are defined
>by position and momentum). So he is interested in probabilities of
>various "organizations" of molecules. And, the Gibbs formula for
>entropy is a measurement of this combination of interests. I suspect
>that it is this combination that is confusing to so many. (Does
>"disorder" mean "disorganization", or does it mean "unpredictability"?)
>In fact, I believe it is reasonable to say that Gibbs' formula measures
>"the unpredictability of which "arrangements" will
>obtain."
>
>
>In fact, Gibbs' formula for thermodynamic entropy looks exactly like
>Shannon's - except for the presence of a constant in Gibbs' formula.
>They are isomorphic! However, they are speaking to different domains.
>Gibbs is modeling a physics phenomenon, and Shannon is modeling a
>mathematical statistics phenomenon. The second law applies to Gibbs'
>conversation - but not to Shannon's.
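>
>(To see the isomorphism numerically - a sketch using the standard forms
>S = -k_B * sum(p * ln p) for Gibbs and H = -sum(p * log2 p) for Shannon,
>which differ only by the constant factor k_B * ln(2):)
>
>    import math
>
>    K_B = 1.380649e-23  # Boltzmann constant, J/K
>    probs = [0.75, 0.25]
>
>    H = -sum(p * math.log2(p) for p in probs)       # Shannon entropy, in bits
>    S = -K_B * sum(p * math.log(p) for p in probs)  # Gibbs form
>
>    print(math.isclose(S, K_B * math.log(2) * H))   # True: same shape, constant apart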
>
>
>In my theory, I use Shannon's - but not Gibbs'.
>
>
>(Oops, I guess that wasn't any shorter than Glen's explanation. :-[ )
>
>
>Grant 


>
>glen e. p. ropella wrote:
>
>> Nicholas Thompson wrote circa 08/05/2010 08:30 AM:
>>> All of this, it seems to me, can be accommodated by – indeed requires –
>>> a common language between information entropy and physics entropy, the
>>> very language which GRANT seems to argue is impossible.
>>
>> OK.  But that doesn't change the sense much.  Grant seemed to be arguing
>> that it's because we use a common language to talk about the two
>> concepts, the concepts are erroneously conflated.  I.e. Grant not only
>> admits the possibility of a common language, he _laments_ the common
>> language because it facilitates the conflation of the two different
>> concepts ... unless I've misinterpreted what he's said, of course.
>>
>>> I would like to apologize to everybody for these errors.  I am beginning
>>> to think I am too old to be trusted with a distribution list.  It’s not
>>> that I don’t go over the posts before I send them … and in fact, what I
>>> sent represented weeks of thinking and a couple of evenings of drafting
>>> … believe it or not!  It seems that there are SOME sorts of errors I
>>> cannot see until they are pointed out to me, and these seem to be, of
>>> late, the fatal ones.
>>
>> We're all guilty of this.  It's why things like peer review and
>> criticism are benevolent gifts from those who donate their time and
>> effort to criticize others.  It's also why e-mail and forums are more
>> powerful and useful than the discredit they usually receive.  While it's
>> true that face-to-face conversation has higher bandwidth, e-mail,
>> forums, and papers force us to think deeply and seriously about what we
>> say ... and, therefore think.  So, as embarrassing as "errors" like this
>> feel, they provide the fulcrum for clear and critical thinking.  I say
>> let's keep making them!
>>
>> Err with Gusto! ;-)


Eric Charles

Professional Student and
Assistant Professor of Psychology
Penn State University
Altoona, PA 16601


============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org
