On Sun, Feb 19, 2012 at 5:32 AM, Evgenii Rudnyi <use...@rudnyi.ru> wrote:

>
> If one defines a thought experiment with Maxwell's demon well, then it
> is quite clear that such a thing does not exist. Why then spend so much
> time on it?


Maxwell's demon is perfectly consistent with classical physics, and it was
not clear why it cannot work until 1929, when Leo Szilard showed that the
demon's measurements must themselves generate entropy. Understanding just
why it cannot exist aids in understanding the relationship between energy,
information, entropy and reversibility. Maxwell's demon was the starting
point for Rolf Landauer's discovery in 1961 that erasing information always
requires energy and increases entropy, because erasure is thermodynamically
irreversible.
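
For a sense of scale, here is a minimal back-of-the-envelope calculation in
Python (the 300 K figure is simply an assumed room temperature, not
something taken from the discussion above):

    import math

    k_B = 1.380649e-23   # Boltzmann constant, J/K
    T = 300.0            # assumed room temperature, K

    # Landauer's bound: erasing one bit dissipates at least k_B * T * ln 2
    E_min = k_B * T * math.log(2)
    print("minimum energy to erase one bit at 300 K: %.3e J" % E_min)
    # prints roughly 2.87e-21 J, i.e. about 0.018 eV per bit erased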

The bottom line is that Maxwell's demon would work if it had information on
when to open and close its shutter. With that information it could decrease
the pressure on one side of a tank filled with gas and increase it on the
other without expending appreciable energy, and then use that pressure
difference to do work. The stumbling block is the information itself: it
would take more energy to obtain it than you would get back from the work.
Actually, by pure chance Maxwell's demon can work, but only very rarely and
only for a very short time.
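
To put a number on the "by pure chance" remark, here is a toy Monte Carlo
sketch in Python (the setup is purely illustrative and assumed here, not
taken from Szilard or Landauer): a shutter that opens and closes at random,
using no information about approaching molecules, does produce occasional
pressure imbalances, but they stay on the order of sqrt(N) and quickly
regress.

    import random

    random.seed(1)
    N = 1000              # molecules shared between the left and right boxes
    left = N // 2         # start perfectly balanced
    steps = 100_000
    max_imbalance = 0

    for _ in range(steps):
        if random.random() < 0.5:        # blind demon: shutter opened at random
            # one randomly chosen molecule crosses to the other side
            if random.random() < left / N:
                left -= 1                # a left-side molecule crosses right
            else:
                left += 1                # a right-side molecule crosses left
        max_imbalance = max(max_imbalance, abs(left - N // 2))

    print("largest chance imbalance:", max_imbalance, "out of", N)
    # typically a few tens of molecules out of 1000, and only fleetingly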

  John K Clark

>
>
>
>>> 3) Reversible chemical reactions and reversible thermodynamic
>>> processes
>>>
>>> I think that the author misuses the term reversible, in the sense
>>> that the word has completely different meanings in thermodynamics
>>> and in chemistry. In thermodynamics, a reversible process implies
>>> that the entropy of the system plus surroundings does not change
>>> (the entropy of the Universe remains constant). In chemistry, the
>>> term reversible reaction means that we have two reactions (forward
>>> and backward) running in parallel, so that by playing with the
>>> conditions we could transform A into B and then B back into A.
>>> However, when a reversible chemical reaction takes place, it is
>>> impossible to implement it as a reversible thermodynamic process.
>>> Hence a reversible chemical reaction is not thermodynamically
>>> reversible.
>>>
>>>
>> A reversible computation has the same meaning of reversible as in
>> thermodynamics. Change of entropy is zero. Information is conserved.
>> Reversible computations can never erase memory locations, for
>> instance, or implement assignment.
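>>
>> A minimal sketch of that distinction (illustrative Python, not taken
>> from the paper under discussion): an in-place XOR update is logically
>> reversible, because the inputs can always be recovered, while plain
>> assignment erases a value and has no inverse.
>>
>>     # Reversible update: (x, y) -> (x, x ^ y). No information is lost.
>>     def forward(x, y):
>>         return x, x ^ y
>>
>>     def backward(x, z):
>>         return x, x ^ z              # recovers the original y exactly
>>
>>     x, y = 5, 12
>>     x2, z = forward(x, y)
>>     assert backward(x2, z) == (5, 12)   # the step runs equally well in reverse
>>
>>     # Irreversible update: assignment overwrites y. The old value is gone,
>>     # so no inverse exists; this is exactly the erasure that costs
>>     # k*T*ln(2) per bit by Landauer's principle.
>>     y = 0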
>>
>
> It is hard to say for sure what the author meant. Let me first quote him.
>
> p. 912(8) "It is well known that all chemical reactions are in principle
> reversible: the same Brownian motion that accomplishes the forward reaction
> also sometimes brings product molecules together, pushes them backward
> through the transition state, and lets them emerge as reactant molecules."
>
> p. 934(30) "As indicated before, the synthesis of RNA by RNA polymerase is
> a logically reversible copying operation, and under appropriate
> (nonphysiological) conditions, it could be carried out at an energy cost of
> less than kT per nucleotide."
>
> My understanding was that in the first quote reversible has its chemical
> meaning. Let us consider, for example, the reaction
>
> A = B
>
> with a forward rate constant of 1000 and a backward rate constant of 1.
> Then we can imagine two different initial states
>
> 1) C(A) = 1, C(B) = 0
> 2) C(A) = 0, C(B) = 1
>
> The equilibrium state will be the same, but we reach it from different
> sides. In both cases, however, the process will be thermodynamically
> irreversible.
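>
> Here is a minimal numerical sketch of this example in Python (the rate
> constants 1000 and 1 are the ones above; the simple Euler integration and
> the step sizes are just illustrative choices):
>
>     def relax(cA, cB, kf=1000.0, kb=1.0, dt=1e-4, steps=10_000):
>         # Euler integration of dA/dt = -kf*A + kb*B for the reaction A = B
>         for _ in range(steps):
>             rate = kf * cA - kb * cB
>             cA -= rate * dt
>             cB += rate * dt
>         return cA, cB
>
>     print(relax(1.0, 0.0))   # initial state 1: C(A) = 1, C(B) = 0
>     print(relax(0.0, 1.0))   # initial state 2: C(A) = 0, C(B) = 1
>     # Both runs end near C(A) = 1/1001 and C(B) = 1000/1001: the same
>     # equilibrium reached from opposite sides, each time through a
>     # thermodynamically irreversible relaxation.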
>
> My point was that one word has two different meanings and that it would be
> good to understand which one is meant.
>
>
>  4) Algorithmic entropy
>>>
>>> I have missed the point about the connection between algorithmic
>>> entropy and thermodynamic entropy. Here it would be good to come back
>>> to Jason's example about his work on secure pseudo-random
>>> number generators
>>>
>>> http://csrc.nist.gov/publications/nistpubs/800-90A/SP800-90A.pdf
>>>
>>> What thermodynamic system should be considered here at all?
>>>
>>> In my view, an algorithm is independent of implementation
>>> details. It seems that this is one of the points on this list where
>>> people claim that it could be possible to make a conscious robot.
>>> Yet how, then, could the thermodynamic entropy be connected with
>>> the algorithmic entropy?
>>>
>>
>> That seems like a non-sequitur. Could you expand on your thinking
>> please?
>>
>
> I have read once more section 6, "Algorithmic entropy and
> thermodynamics" (p. 936 (30)), of the paper. I should confess that I do
> not know exactly what the author meant by algorithmic entropy. My
> reading was
>
> algorithmic entropy == entropy of an algorithm
>
> and I have used, and will stick to, this meaning.
>
> In my understanding, when we consider an algorithm, it is a pure IT
> construct that does not depend on whether I implement it with an abacus or
> some Turing machine, with an Intel or a PowerPC processor. From this it
> follows that the algorithm, and hence its entropy, does not depend on the
> temperature or pressure of the physical system that does the computation.
> In my view this makes sense.
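>
> To make this implementation-independence concrete, here is a small sketch
> (using compressed length as a crude stand-in for algorithmic entropy,
> which is not computable exactly; the example strings are arbitrary): the
> numbers depend only on the bit strings themselves, not on the processor,
> temperature or pressure of the machine that runs the code.
>
>     import os
>     import zlib
>
>     def approx_algorithmic_entropy(data: bytes) -> int:
>         # The compressed size is an upper bound on the length of the
>         # shortest program that reproduces the data -- a rough proxy
>         # for algorithmic (Kolmogorov) entropy.
>         return len(zlib.compress(data, 9))
>
>     ordered = b"AB" * 5000            # 10000 bytes of a trivial pattern
>     disordered = os.urandom(10000)    # 10000 bytes with no pattern at all
>
>     print(approx_algorithmic_entropy(ordered))     # small: compresses well
>     print(approx_algorithmic_entropy(disordered))  # close to 10000 bytes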
>
> Let us consider consciousness now. Our brain produces it, and our brain
> has some thermodynamic entropy. If we assume that the same effect could be
> achieved with some robot, does this mean that the thermodynamic entropy of
> the robot must be the same as that of the brain?
>
>
>
>>> 5) DNA, RNA and information
>>>
>>> I have recently read
>>>
>>> Barbieri, M. (2007). Is the cell a semiotic system? In:
>>> Introduction to Biosemiotics: The New Biological Synthesis. Eds.:
>>> M. Barbieri, Springer: 179-208.
>>>
>>>
>> I'm afraid semiotics leaves me cold. I've never seen one useful
>> conjecture come out of it. Apologies to all those Peirceans out
>> there.
>>
>
> Everything is a matter of comparison. Recently I got interested in
> Artificial Life, and people recommended to me Christoph Adami's
> “Introduction to Artificial Life”. He claims, for example,
>
> p. 5 "An even more general approach is the thermodynamic one, which
> attempts to define living systems in terms of their ability to maintain low
> levels of entropy, or disorder, only".
>
> One could say something like this, but then the question is what the
> entropy is. And this is what Adami writes about entropy:
>
> p. 94 “Entropy is a measure of the disorder present in a system, or
> alternatively, a measure of our lack of knowledge about this system.”
>
> p. 96 “If an observer gains knowledge about the system and thus determines
> that a number of states that were previously deemed probable are in fact
> unlikely, the entropy of the system (which now has turned into a
> conditional entropy), is lowered, simply because the number of different
> possible states is now lower. (Note that such a change in uncertainty is
> usually due to a measurement).”
>
> p. 97 “Clearly, the entropy can also depend on what we consider
> “different”. For example, one may count states as different that differ by,
> at most, del_x in some observable x (for example, the color of a ball drawn
> from an ensemble of differently shaded balls in an urn). Such entropies are
> then called fine-grained (if del_x is small), or coarse-grained (if del_x
> is large) entropies.”
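>
> For what it is worth, the "measurement lowers the entropy" point can be
> put into one short numerical example (informational entropy only,
> measured in bits; the die is a toy example, not Adami's):
>
>     from math import log2
>
>     def shannon_entropy(probs):
>         return -sum(p * log2(p) for p in probs if p > 0)
>
>     # Before any measurement: six equally likely outcomes of a fair die.
>     print(shannon_entropy([1/6] * 6))   # log2(6), about 2.585 bits
>
>     # After the observer learns the parity, three outcomes remain.
>     print(shannon_entropy([1/3] * 3))   # log2(3), about 1.585 bits
>     # The die has not changed; only the observer's uncertainty about it has.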
>
> I am a thermodynamicist and, frankly speaking, I was just shocked after
> reading this. For me it was clear that Adami does not know what
> experimental thermodynamics is (and presumably is unaware of experimental
> thermodynamics altogether).
>
> I understand now that Adami's viewpoint is quite common among physicists,
> but I do not think that it yields the "useful conjecture" mentioned above.
> I had a discussion about this on the biotaconv list; see the summary at
>
> http://blog.rudnyi.ru/2010/12/entropy-and-artificial-life.html
>
> but no one there could explain to me what the difference in consequences
> for artificial life research is between the two statements
>
> 1) The thermodynamic and information entropies are equivalent.
>
> 2) The thermodynamic and information entropies are completely different.
>
> It seems that the choice between 1) and 2) does not influence artificial
> life research at all.
>
> In this sense, I like Barbieri's paper much more. At least I could
> follow his logic.
>
> Evgenii
>
