Dear folks,
I think that Koichiro is right. I would say more, though: the loops just
have to be non-reducible to look a lot like biological things. This is
basically Robert Rosen's position. The sort of loops required aren't mere
iterations (which can be decomposed into a sequence of separate steps);
rather, they are the non-reducible kind.
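To make "non-reducible" concrete, here is the usual rendering of Rosen's
(M,R)-system loop; the notation f, \Phi, \beta is Rosen's standard one,
not something introduced in this thread:

    x_{n+1} = f(x_n)  \Rightarrow  x_n = f^n(x_0)   % an iteration: factors into separate steps
    f : A \to B                                     % metabolism
    \Phi : B \to H(A,B)                             % repair: regenerates f from the output
    \beta : H(A,B) \to H(B, H(A,B))                 % replication: regenerates \Phi

If I recall Rosen's construction correctly, \beta is itself obtained from
within the system (from an element of B), so every map in the loop is
produced by the loop it sustains. Remove any one map and supply it from
outside, and the closure is gone; that is what blocks the decomposition
into iterable steps.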

Folks,
Doing dimensional analysis: entropy is heat (difference) divided by
temperature. Heat is energy, and temperature is energy per degree of freedom.
Dividing, we get units of degrees of freedom. I submit that information has
the same fundamental measure (this is a consequence of Scott
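Spelling the bookkeeping out (the equipartition reading of temperature is
what licenses the second line, factors of 2 aside):

    dS = \delta Q / T                             % Clausius: entropy change
    [\delta Q] = \text{energy}
    [T] = \text{energy per degree of freedom}     % equipartition: E = (f/2) k_B T
    [S] = \text{energy} / (\text{energy}/\text{dof}) = \text{degrees of freedom}
                                                  % a pure count, in units of k_B

Shannon information is likewise a dimensionless count (bits), which is why
I say the two share a fundamental measure once k_B is set aside.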

Dear John and colleagues,
So fundamentally we are talking about the same basic thing with information
and entropy. The crux is that the two are the same except for a constant.
Most authors attribute the dimensionality to this constant (kB).
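In symbols (taking base-2 logarithms for H is my convention here; any other
base only shifts the constant):

    S = k_B \ln W                 % Boltzmann
    H = \log_2 W                  % Shannon, in bits
    \Rightarrow S = (k_B \ln 2)\, H

so all of the dimensionality sits in the conversion factor
k_B \ln 2 \approx 9.57 \times 10^{-24} J/K per bit.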
From the perspective of

Loet,
I think that is consistent with what I said: different ways of measuring,
and different perspectives. I prefer the unity that comes out of the
dimensional analysis approach, but I was always taught that if you want to
really understand something, you should absorb it first. But my background is in

Folks,
I know there is a long legacy of equating information with entropy, and
dimensionally, they are the same. Qualitatively, however, they are
antithetical. From the point of view of statistical mechanics, information
is a *decrease* in entropy, i.e., they are negatives of each other.
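In signs (reading "information" here as negentropy is my gloss on the claim):

    I \equiv S_{\max} - S         % information as the shortfall from maximum entropy
    \Delta I = -\Delta S

so acquiring one bit of information about a system's microstate corresponds
to an entropy decrease of k_B \ln 2.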
This all