Re: [Fis] Information Foundation of the Act--F.Flores L.deMarcos

2015-07-27 Thread John Collier
Dear folks, I think that Koichiro is right. I would say more, though: the loops merely have to be non-reducible in order to look a lot like biological things. This is basically Robert Rosen's position. The sort of loops required aren't just iterations (which can be decomposed). Rather they are the

Re: [Fis] Answer to the comments made by Joseph

2015-07-27 Thread John Collier
Folks, Doing dimensional analysis, entropy is heat difference divided by temperature. Heat is energy, and temperature is energy per degree of freedom. Dividing, we get units of degrees of freedom. I submit that information has the same fundamental measure (this is a consequence of Scott
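The dimensional bookkeeping in this argument can be written out explicitly (a sketch, reading temperature literally as energy per degree of freedom, as the message states):

```latex
% Clausius definition of entropy, with the dimensional readings
% given in the message above:
[S] = \frac{[\Delta Q]}{[T]}
    = \frac{\text{energy}}{\text{energy}/\text{degree of freedom}}
    = \text{degrees of freedom}
```

So on this reading the Clausius ratio comes out as a (dimensionless) count of degrees of freedom, which is what allows it to be compared directly with an information measure.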

Re: [Fis] Answer to the comments made by Joseph

2015-07-27 Thread Loet Leydesdorff
Dear John and colleagues, So fundamentally we are talking about the same basic thing with information and entropy. The problem is precisely that the two are the same except for a constant. Most authors attribute the dimensionality to this constant (kB). From the perspective of
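The constant in question can be made explicit. If H is Shannon entropy in bits and S is thermodynamic entropy, the standard bridge (a textbook identity, not taken from the message itself) is:

```latex
H = -\sum_i p_i \log_2 p_i, \qquad S = k_B \ln 2 \; H
```

Here kB (times ln 2) carries all of the physical dimension, J/K, while H itself is dimensionless — which is exactly the point at issue: whether the dimensionality lives in the constant or in the quantity.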

Re: [Fis] Answer to the comments made by Joseph

2015-07-27 Thread John Collier
Loet, I think that is consistent with what I said. Different ways of measuring and perspectives. I prefer to see the unity that comes out of the dimensional analysis approach, but I was always taught that if you wanted to really understand something, absorb that first. But my background is in

Re: [Fis] Answer to the comments made by Joseph

2015-07-27 Thread Robert E. Ulanowicz
Folks, I know there is a long legacy of equating information with entropy, and dimensionally they are the same. Qualitatively, however, they are antithetical. From the point of view of statistical mechanics, information is a *decrease* in entropy, i.e., they are negatives of each other. This all
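The sign relationship being described is usually written as Brillouin's negentropy relation (a standard formulation, assumed here rather than quoted from the message):

```latex
I = S_{\max} - S, \qquad \Delta I = -\,\Delta S
```

That is, a gain of information about a system corresponds to a reduction of its entropy below the maximum — dimensionally identical quantities, but opposite in sign.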