Dear all,
As the notion of information is again (and interestingly) brought to the forefront,
let’s not forget the evolutionary approach, which naturally introduces the notion
of meaning and allows us to bring in a system-oriented perspective.
Setting aside the question of the raison d’être of the universe, there was no
entity to care about information before the emergence of life on Earth.
Information is a notion that we humans have invented, as a set of tools to help
us understand and manage our world. And animals also manage information.
A basic tool is the measurement of the quantity of information by the Shannon
transmission capacity of a channel, whatever the meaning of the information
being transmitted through the channel.
The meaning of a piece of information can go by many names: content, purpose,
aboutness, goal, target, sense, aim, …
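As a purely illustrative sketch of this tool (not part of the original discussion), the Shannon measure depends only on the probabilities of the symbols, never on what the symbols mean:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair binary source carries 1 bit per symbol, whatever its symbols mean.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased source carries less, though its symbols are the same.
print(shannon_entropy([0.9, 0.1]))   # ~0.47
```

The same number comes out whether the two symbols stand for "predator/no predator" or for arbitrary voltages, which is exactly the sense in which the measure is meaning-blind.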
As already presented in the FIS discussions, I feel that the meaning of
information (whatever its name) exists because there is a system that needs
this meaning, a system that creates this meaning or uses it in order to satisfy
a constraint. The system may be an animal, a human, or an artificial system.
The constraints guiding meaning generation can be many and varied: organic
(stay alive, maintain the species, …), human (valorise the ego, look for
happiness, …), or artificial (obey a process, …). Following such an approach
allows meaning generation to be modelled by a simple system applicable to
animals, humans, and robots alike (1), (2).
This does not pretend to answer all the questions related to the complex
subject of meaningful information, but it does introduce that needed notion in
simple terms.
All the best,
Christophe
(1) http://cogprints.org/6279/2/MGS.pdf
(2)
http://www.eucognition.org/uploads/docs/First_Meeting_Hamburg/Workshop_A__menant-web.pdf
> Date: Mon, 30 Nov 2009 08:53:48 +0200
> To: [email protected]; [email protected]
> From: [email protected]
> Subject: Re: [Fis] Asymetry and Information: A modest proposal
>
> At 11:13 PM 2009/11/27, you wrote:
> >Dear Joseph,
> >
> >Be my guest and have some Irish children for breakfast!
> >
> >I did not mean my intervention as directed against substantive theorizing.
> >In addition to a mathematical theory of communication, we need substantive
> >theories of communication. This became clear to me when Maturana formulated
> >life as a consequence of the communication of molecules. If atoms are
> >communicated, one obtains a theory of chemical evolution (Mason), etc. All
> >these special theories of communication can usefully be matched with a
> >mathematical theory of communication (or perhaps more generally non-linear
> >dynamics).
> >
> >The special case, of course, is that when one multiplies H by k(B) one
> >obtains S (joules/kelvin). John seems to imply that there is another unit of
> >information in physics which is a conserved entity. John: Can you perhaps
> >provide the dimensionality of this unit and provide the derivation?
>
> Dear Loet,
>
> It is usually defined as a bit, which is understood as a binary distinction,
> wherefore the "it from bit" formulation found in a number of places, but
> the term is due, I believe, to John Wheeler. More typically the term is
> related to entropy considerations (as in the black hole case). My
> derivation is by dimensional analysis. Entropy is the complement
> of information. If we take the maximal entropy of a system by
> relaxing all constraints with no other change in macroscopic
> parameters (impossible in practice, but possible in the imagination),
> and subtract from this the statistical entropy using Boltzmann's
> formulation based on the number of complexions of the system,
> we get negentropy, which can be identified with the information
> in the system. This will break up into two parts, configurational
> and statistical. The it from bit view is usually talking of configurational
> information. The difference between the two is largely a matter of relative
> time scale, but the time scale differences are typically large, so
> there is a qualitative difference. So negentropy (physical information)
> should be in entropy units. Entropy, as you point out, can be measured
> as joules per degree Kelvin. Going back to basics, joules are energy,
> and degrees Kelvin measure average energy per degree of freedom.
> Dividing through by the energy, and correcting for the double denominator,
> we get information in units of degrees of freedom. I submit that bits
> are an excellent measure of degrees of freedom, both being pure numbers.
>
> So that is it, information (and entropy) are pure numbers with dimensions
> of degrees of freedom. Boltzmann's constant relates this to energy
> measures and other physical values. However, information as
> a measure of degrees of freedom can be used in more abstract
> formulations as well (it implies Shannon's approach, as well as
> all but the required machine dependent part of the computational
> approach). I think it is as fundamental as we can get.
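The dimensional bookkeeping in the quoted derivation can be sketched numerically. The following is an editorial toy illustration (not from the original mail), assuming Boltzmann's S = k_B ln W and a made-up system of binary degrees of freedom:

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, in joules per kelvin

def boltzmann_entropy(complexions):
    """S = k_B * ln(W), in joules per kelvin."""
    return K_B * math.log(complexions)

# Toy system: 10 binary degrees of freedom, so W_max = 2**10 complexions
# with all constraints relaxed; constraints leave only W = 2**4 accessible.
s_max = boltzmann_entropy(2 ** 10)
s = boltzmann_entropy(2 ** 4)

negentropy = s_max - s                         # physical information, J/K
info_bits = negentropy / (K_B * math.log(2))   # divide out k_B: a pure number
print(info_bits)   # 6 bits, i.e. 6 constrained binary degrees of freedom
```

Dividing the negentropy by k_B strips the joule-per-kelvin units away, leaving a pure number of (binary) degrees of freedom, which is the point of the dimensional argument above.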
>
> I've argued this all on the list in one place or another before.
>
> John
>
>
> ----------
> Professor John Collier [email protected]
> Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
> T: +27 (31) 260 3248 / 260 2292 F: +27 (31) 260 3031
> http://www.ukzn.ac.za/undphil/collier/index.html
>
> _______________________________________________
> fis mailing list
> [email protected]
> https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis