Is it fair to say that Grant is talking about what one might call structural
vs. behavioral entropy?

Let's say I have a number of bits in a row. That has very low structural
entropy. It takes very few bits to describe that row of bits. But let's say
each is hooked up to a random signal. So behaviorally the whole thing has
high entropy. But the behavioral uncertainty of the bits is based on the
assumed randomness of the signal generator. So it isn't really the bits
themselves that have high behavioral entropy. They are just a "window"
through which we are observing the high entropy randomness behind them.
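
To make that concrete, here's a rough sketch in Python (the "random signal"
is just a coin flip I made up for illustration):

import math
import random
from collections import Counter

def entropy_per_symbol(seq):
    # Empirical Shannon entropy of a sequence, in bits per symbol.
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in Counter(seq).values())

# Structurally, the row is trivial: "N cells in a line" takes only a
# handful of bits to describe, no matter how large N is.

# Behaviorally, watching one cell driven by a random signal looks
# maximally uncertain:
samples = [random.randint(0, 1) for _ in range(10_000)]
print(entropy_per_symbol(samples))  # ~1.0 bit per observation

The ~1.0 bit comes from the generator, not from the structure of the row --
which is the "window" point above.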

This is a very contrived example. Is it at all useful for a discussion of
structural entropy vs. behavioral entropy? I'm asking that in all
seriousness; I don't have a good sense of how to think about this.

This suggests another thought. A system may have high entropy in one
dimension and low entropy in another. Then what? Most of us are very close
to the ground most of the time. But we don't stay in one place in that
relatively 2-dimensional world. This sounds a bit like Nick's example. If
you know that an animal is female, you can predict more about how she will
act than if you don't know that.
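
In information terms, Nick's example is just conditional entropy: knowing
the animal's sex cannot increase, and usually decreases, your uncertainty
about behavior. A toy calculation (all numbers invented for illustration):

import math

def H(dist):
    # Shannon entropy, in bits, of a dict of probabilities.
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Hypothetical joint distribution P(sex, behavior), for illustration only.
joint = {('F', 'nests'): 0.40, ('F', 'roams'): 0.10,
         ('M', 'nests'): 0.10, ('M', 'roams'): 0.40}

behavior = {}  # marginal P(behavior)
sex = {}       # marginal P(sex)
for (s, b), p in joint.items():
    behavior[b] = behavior.get(b, 0) + p
    sex[s] = sex.get(s, 0) + p

print(H(behavior))        # 1.0 bit: uncertainty if you don't know the sex
print(H(joint) - H(sex))  # ~0.72 bits: H(behavior | sex), once you do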

One other thought: Nick talked about gradients and the tendency for them to
dissipate. Is that really so? If you put two immiscible liquids in a bottle,
one denser than the other, the result will be a layer cake of liquids with a
very sharp gradient between them. Will that ever dissipate?

What I think is more to the point is that potential energy gradients will
dissipate. Nature abhors a potential energy gradient -- but not all
gradients.


-- Russ


On Thu, Aug 5, 2010 at 11:09 AM, Grant Holland
<[email protected]> wrote:

>  Glen is very close to interpreting what I mean to say. Thanks, Glen!
>
> (But of course, I have to try one more time, since I've thought of another
> - hopefully more compact - way to approach it...)
>
> Logically speaking, "degree of unpredictability" and "degree of
> disorganization" are orthogonal concepts and ought to be able to vary
> independently - at least in certain domains. If one were to develop a theory
> about them (and I am), then that theory should provide for them to be able
> to vary independently.
>
> Of course, for some "applications" of that theory, these
> "predictability/unpredictability" and "organization/disorganization"
> variables may be dependent on each other. For example, in Thermodynamics, it
> may be that the degree of unpredictability and the degree of disorganization
> are correlated. (This is how many people seem to interpret the second law.)
> But this is specific to a Physics application.
>
> However, in other applications, it could be that the degree of uncertainty
> and the degree of disorganization vary independently. For example, I'm
> developing a mathematical theory of living and lifelike systems. Sometimes in
> that domain there is a high degree of predictability that an organo-chemical
> entity is organized, and sometimes there is unpredictability around that.
> The same statement goes for predictability or unpredictability around
> disorganization. Thus, in the world of living systems, unpredictability
> and disorganization can vary independently.
>
> To make matters more interesting, these two variables can be joined in a
> joint space. For example, in the "living systems example" we could ask about
> the probability of advancing from a certain disorganized state in one moment
> to a certain organized state in the next moment. In fact, we could look at
> the entire probability distribution of advancing from this certain
> disorganized state at this moment to all possible states at the next moment
> - some of which are more disorganized than others. If we ask this question,
> then we are asking about a probability distribution over states that have
> varying degrees of organization associated with them. And since we now have
> a probability distribution, we can ask "what is its Shannon entropy?" That
> is, what is its degree of unpredictability? So we have created a joint space
> that asks about both disorganization and unpredictability at the same time.
> This is what I do in my theory ("Organic Complex Systems").
>
> Statistical Thermodynamics (statistical mechanics) also mixes these two
> orthogonal variables in a similar way. This is another way of looking at
> what Gibbs (and Boltzmann) contributed. Gibbs especially talks about the
> probability distributions of various "arrangements" (organizations) of
> molecules in an ideal gas (these arrangements, or states, are defined by
> position and momentum). So he is interested in probabilities of various
> "organizations" of molecules. And, the Gibbs formula for entropy is a
> measurement of this combination of interests. I suspect that it is this
> combination that is confusing to so many. (Does "disorder" mean
> "disorganization", or does it mean "unpredictability"?) In fact, I believe it
> is reasonable to say that Gibbs' formula measures the unpredictability of
> which "arrangements" will obtain.
>
> In fact, Gibbs' formula for thermodynamic entropy looks exactly like
> Shannon's - except for the presence of a constant in Gibbs' formula. They
> are isomorphic! However, they speak to different domains. Gibbs is modeling
> a physical phenomenon, and Shannon is modeling a phenomenon of mathematical
> statistics. The second law applies to Gibbs' conversation - but not to
> Shannon's.
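>
> For the record, the two formulas side by side (k is Boltzmann's constant):
>
>   Gibbs:    S = -k * sum_i p_i * ln(p_i)
>   Shannon:  H = -sum_i p_i * log2(p_i)
>
> Up to the constant k - and the base of the logarithm, which is just another
> constant factor - they are the same function of the distribution {p_i}.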
>
> In my theory, I use Shannon's - but not Gibbs'.
>
> (Oops, I guess that wasn't any shorter than Glen's explanation. :-[ )
>
> Grant
>
>
> glen e. p. ropella wrote:
>
> Nicholas Thompson wrote circa 08/05/2010 08:30 AM:
>
>
>  All of this, it seems to me, can be accommodated by – indeed requires –
> a common language between information entropy and physics entropy, the
> very language which GRANT seems to argue is impossible.
>
>
>  OK.  But that doesn't change the sense much.  Grant seemed to be arguing
> that because we use a common language to talk about the two concepts, the
> concepts are erroneously conflated.  I.e. Grant not only
> admits the possibility of a common language, he _laments_ the common
> language because it facilitates the conflation of the two different
> concepts ... unless I've misinterpreted what he's said, of course.
>
>
>
>  I would like to apologize to everybody for these errors.  I am beginning
> to think I am too old to be trusted with a distribution list.  It’s not
> that I don’t go over the posts before I send them … and in fact, what I
> sent represented weeks of thinking and a couple of evenings of drafting
> … believe it or not!  It seems that there are SOME sorts of errors I
> cannot see until they are pointed out to me, and these seem to be, of
> late, the fatal ones.
>
>
>  We're all guilty of this.  It's why things like peer review and
> criticism are benevolent gifts from those who donate their time and
> effort to criticize others.  It's also why e-mail and forums are more
> powerful and useful than the discredit they usually receive would
> suggest.  While it's
> true that face-to-face conversation has higher bandwidth, e-mail,
> forums, and papers force us to think deeply and seriously about what we
> say ... and, therefore think.  So, as embarrassing as "errors" like this
> feel, they provide the fulcrum for clear and critical thinking.  I say
> let's keep making them!
>
> Err with Gusto! ;-)
>
>
>
>
> --
> Grant Holland
> VP, Product Development and Software Engineering
> NuTech Solutions
> 404.427.4759
>
>
============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org
