On Tue, Apr 14, 2015 at 12:39 PM, Telmo Menezes <[email protected]>
wrote:

>> My problem with any view based on entropy is that entropy doesn't appear
>> to be fundamental to physics; it is the statistically likely result when
>> objects are put in a certain configuration and allowed to evolve randomly.
>>
>
> There is, however, an interesting parallel to be made with Shannon's
> entropy, which is a measure of information content and not just a
> statistical effect. Once in the realm of digital physics, it becomes
> questionable if physical entropy and information entropy are separate
> things.
>

I think the second law of thermodynamics is the most fundamental law of
physics; in fact it's almost a law of logic rather than physics. "Entropy
will always increase" just says that there are more ways to be complicated
than simple, so any change in a system will probably make it more
complicated and not simpler. Or, to put it in Shannon's language, it takes
more information to describe a complicated thing than a simple thing.
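As a rough numerical sketch of that counting argument (my illustration, not
from the original email): for a system of 100 coins, the "simple" macrostate
of all heads has exactly one microstate, while the "complicated" macrostate
of half heads has an astronomically large number, so it takes many more bits
of Shannon information to single out one of its microstates.

```python
from math import comb, log2

n = 100                       # number of coins in the toy system
ordered = comb(n, 0)          # microstates with all heads: exactly 1
mixed = comb(n, n // 2)       # microstates with 50 heads: roughly 1e29

# Bits needed to specify one particular microstate of each macrostate
print(log2(ordered))          # 0 bits for the simple state
print(log2(mixed))            # roughly 96 bits for the mixed state
```

A random change to the all-heads state almost certainly lands among the vastly
more numerous mixed microstates, which is the statistical content of the
second law.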

 John K Clark

-- 
You received this message because you are subscribed to the Google Groups 
"Everything List" group.