Re: [Fis] Shannonian Mechanics? - Species specific?
Dear Jerry,

At the risk of being jailed by Pedro, let me point to the beauty of the example:

> From a molecular biological perspective, the assertion of “same encoding” of
> information is contrary to fact.

OK: the coding of the information is species specific, both theoretically and empirically. I fully agree. But this argument cannot carry the inference that the information (to be coded) is species specific.

If one wishes to define information as “a difference which makes a difference”, reference systems for both differences have to be specified. Differences(1) can make a difference(2) for a system of reference (the receiver). The latter system can receive the information and code it, or the information can be discarded as noise. Noise, or probabilistic entropy, can be defined as differences(1) without a difference(2). A set of differences(1) can be considered as a probability distribution which is as yet meaningless; that is, Shannon-type information.

Distinguishing between the coding (= diff2 operating on diff1) and the coded differences(1) is a condition for analytical clarity. Otherwise, one uses the same word for two different concepts and confusion can be expected to prevail. The idea that one can reconcile two analytically different concepts in a “universal” theory is mistaken.

Best,
Loet

_______________________________________________
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
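[Editorial note: Loet's distinction between differences(1) and a difference(2) can be made concrete. Shannon-type information is computable from a distribution of differences(1) alone, before any receiving system codes it or discards it as noise. A minimal sketch; the example distributions are illustrative assumptions, not anything from the thread:]

```python
from math import log2

def shannon_entropy(probs):
    """H = -sum p*log2(p), in bits: the expected information of a
    probability distribution of differences(1), prior to any coding
    by a system of reference."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A set of differences(1) as a probability distribution --
# "as yet meaningless" until a receiver codes it.
uniform = [0.25, 0.25, 0.25, 0.25]   # maximal uncertainty
skewed  = [0.7, 0.1, 0.1, 0.1]       # same support, less uncertainty

print(shannon_entropy(uniform))  # 2.0 bits
print(shannon_entropy(skewed))   # ~1.36 bits
```

The measure says nothing about what the four states mean to any receiver; it quantifies only the distribution itself, which is Loet's point about keeping the two concepts apart.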
Re: [Fis] Shannonian Mechanics? - Species specific?
List:

> Your claim that information is SPECIES SPECIFIC is completely at variance
> with the EMPIRICAL EVIDENCE that I presented in my 3-week session that the
> minds of different animal species have used the same encoding of gestalt
> forms for the past 400 million years since the evolution of the amniotes.

Pedro’s assertion that biological information is species specific is amply supported by massive amounts of molecular biological evidence. One of the critical “differences that make a difference” between species is that each member of a specific species has a DNA sequence that is compatible with reproduction within the species. (Even though the concept of a species is that of homology of individuals, not homogeneity of individuals.)

From a molecular biological perspective, the assertion of “same encoding” of information is contrary to fact.

Cheers,
Jerry

> On Jun 30, 2016, at 11:45 PM, Alex Hankey wrote:
>
> Pedro suggested that I send these comments to the whole group, so here they are.
>
> -- Forwarded message --
> From: "Alex Hankey" <mailto:alexhan...@gmail.com>
> Date: 29 Jun 2016 21:20
> Subject: Re: [Fis] Shannonian Mechanics?
> To: "Pedro C. Marijuan" <mailto:pcmarijuan.i...@aragon.es>
>
> Dear Pedro,
>
> Your claim that information is SPECIES SPECIFIC is completely at variance
> with the EMPIRICAL EVIDENCE that I presented in my 3-week session that the
> minds of different animal species have used the same encoding of gestalt
> forms for the past 400 million years since the evolution of the amniotes.
>
> Study of the response of plants to human intentions has similar implications
> related to Rupert Sheldrake's 'Sense of Being Stared At'. These WELL
> authenticated phenomena have hugely important implications for our
> understanding of information in Experience -- the topic of my presentation.
>
> Best wishes,
> Alex Hankey
>
> On 29 Jun 2016 4:24 pm, "Pedro C. Marijuan" <mailto:pcmarijuan.i...@aragon.es> wrote:
> Dear Marcus, Loet, Bob... and All,
Re: [Fis] Shannonian Mechanics?
I think complaining about Shannon entropy as a measure of information is completely justified, because it is steam-engine physics, unfortunately still widely used despite its many flaws and limitations. But to think that Shannon entropy is at the front end of the mathematical discussion of information is a mistake, and this and other groups have perpetually been entrapped in a 60s-and-70s discussion of a fake, ancient theory of information that not even Shannon himself thought was worth using for anything meaningful about information, but only for measuring communication. Indeed, Shannon entropy is nothing but a counting function over states/symbols; at best it is a measure of diversity, a bound on information transfer.

The technical and philosophical discussion here and everywhere else should be (and, among those at the scientific front, has been) focused on what has been done in the last 50 years to leave Shannon entropy behind. But almost nobody here (and almost nowhere else) discusses algorithmic randomness, Levin's universal distribution, measures of sophistication, etc.; people prefer to remain in a continuous state of pre-60s Shannon-entropy discussion. Shannon entropy should not even be mentioned any longer in serious discussions about information; we moved on a long time ago (unfortunately, many physicists have not).

Trying to be constructive.

All best,
- Hector
http://www.hectorzenil.net/

On Wed, Jun 29, 2016 at 3:16 PM, joe.brenner wrote:
> Dear Loet,
> The way you have asked it, I think the answer to your question is known:
> both order and disorder are universals, linked dialectically. Never one
> without the other, as for symmetry and asymmetry, except in trivially
> simple cases.
> Cheers,
> Joseph
>
> Original message
> From: Loet Leydesdorff
> Date: 29/06/2016 14:40 (GMT+01:00)
> To: "'Pedro C. Marijuan'" , fis@listas.unizar.es
> Subject: Re: [Fis] Shannonian Mechanics?
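[Editorial note: Hector's claim that Shannon entropy is "at best a measure of diversity" can be illustrated. Algorithmic (Kolmogorov) complexity itself is uncomputable, but compressed length is a standard computable proxy. The sketch below is an editorial illustration, not from the thread: two strings with identical symbol frequencies, hence identical empirical Shannon entropy, that a compressor immediately tells apart.]

```python
import random
import zlib

# Two strings with identical symbol counts -- hence identical
# empirical Shannon entropy (1 bit/symbol) -- but very different
# structure. Compressed length serves as a rough, computable proxy
# for algorithmic complexity.
structured = "ab" * 500

chars = list(structured)
random.seed(0)               # fixed seed, so the run is reproducible
random.shuffle(chars)
shuffled = "".join(chars)    # same counts of 'a' and 'b', no pattern

c_structured = len(zlib.compress(structured.encode()))
c_shuffled = len(zlib.compress(shuffled.encode()))

# The periodic string compresses far better than the shuffled one,
# although the entropy of their symbol distributions is the same.
print(c_structured, c_shuffled)
```

This is the sense in which Shannon entropy only counts states: it cannot distinguish a repetitive, algorithmically simple sequence from a random-looking one with the same letter frequencies.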
Re: [Fis] Shannonian Mechanics?
Dear Loet,

The way you have asked it, I think the answer to your question is known: both order and disorder are universals, linked dialectically. Never one without the other, as for symmetry and asymmetry, except in trivially simple cases.

Cheers,
Joseph

Sent from Samsung Mobile.

Original message
From: Loet Leydesdorff
Date: 29/06/2016 14:40 (GMT+01:00)
To: "'Pedro C. Marijuan'" , fis@listas.unizar.es
Subject: Re: [Fis] Shannonian Mechanics?
Re: [Fis] Shannonian Mechanics?
Dear Pedro and colleagues,

> The figure from Weaver in Loet's excellent posting leaves a few aspects
> outside. The why, the what, the how long, the with whom, and other aspects
> of the information phenomenon do not enter. By doing that we have
> streamlined the phenomenon... and have left it ready for applying a highly
> successful theory, in the technological and in many other realms
> (linguistics, artif. intelligence, neurodynamics, molec. networks, ecol.
> networks, applied soc. metrics, etc). Pretty big and impressive, but is it
> enough? Shouldn't we try to go beyond?

In my opinion, "the why, the what, the how long, the with whom, and other aspects..." are subject to substantive theorizing. The type of answers will be very different when studying biological or other systems of reference. But then the information is provided with meaning by these theories, and we discuss "meaningful information" as different from Shannon-type information. There will in this case be a dimension to the information.

For example, when particles collide, there is an exchange of momenta and energy. The dissipation is then dimensioned as Joule/Kelvin (S = kH). In chemistry one assumes a mass balance and thus a redistribution of atoms over molecules, etc. The dimensionality of interhuman communication is hitherto not specified.

> I wonder whether a far wider "phenomenology of information" is needed
> (reminding what Maxine argued months ago about the whole contemplation of
> our own movement, or Plamen about the "war on cancer"?). If that inquiry is
> successful we could find for instance that:

This is not successful. It does not lead to a research program, but to a "philosophie spontanée des savants" (Althusser), as your comprehensive question about "the why, the what, the how long, the with whom, and other aspects" illustrates. The hidden program is biologistic:

> 2. Those UNIVERSALS are SPECIES SPECIFIC.

"ESSENTIAL CORES" are discipline specific!

> 3. Those UNIVERSALS would be organized, wrapped, around an ESSENTIAL CORE.
> It would consist in the tight ingraining of self-production and
> communication (almost inseparable, and both information based!). In the
> human special case, it is the whole advancement of our own lives that
> propels us to engage in endless communication --about the universals of our
> own species-- but with the terrific advantage of an open-ended
> communication system, language.
>
> 4. Those UNIVERSALS would have been streamlined in very different ways and
> taken as "principles" or starting points for a number of disciplines
> --remembering the discussion about the four Great Domains of Science. A
> renewed Information Science should nucleate one of those domains.

"Should" is an expression of uneasiness? In my opinion, the assumption of an origin is problematic: order is not given (ex ante) and then branching, but emerging (ex post) from disorder (entropy). Is "disorder" perhaps a universal? In which specific system? (I would have a provisional answer/hypothesis; but this is my second penny for this week.)

Best,
Loet
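[Editorial note: Loet's dimensional remark above, that dissipation is dimensioned as Joule/Kelvin via S = kH, can be sketched numerically. The two-state example below is an editorial illustration, not from the thread: the Shannon measure is dimensionless until multiplied by Boltzmann's constant.]

```python
from math import log

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def entropy_nats(probs):
    """Dimensionless Shannon entropy H = -sum p*ln(p), in nats."""
    return -sum(p * log(p) for p in probs if p > 0)

# A two-state system with equal probabilities: H = ln 2 nats,
# a pure number with no physical dimension ("a priori").
H = entropy_nats([0.5, 0.5])

# Multiplying by k_B gives S the dimension Joule/Kelvin --
# the S = kH of Loet's example of colliding particles.
S = K_B * H
print(H)  # ~0.6931 nats
print(S)  # ~9.57e-24 J/K
```

The same dimensionless H can thus be "provided with meaning" by a substantive theory (here, statistical mechanics); for interhuman communication, as Loet notes, no such dimensionality has been specified.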
[Fis] Shannonian Mechanics?
Dear Marcus, Loet, Bob... and All,

Again very briefly: your exchanges make clear the limits of the received Shannonian approach and the (narrow?) corridors left for advancement. I find this situation highly reminiscent of what happened with Mechanics long ago: an excellent theory (but of limited scope) was overstretched and used as a paradigm of what All science should be... It contributed well to technology and to some other natural-science disciplines, but was far from useful --nefarious?-- for the humanities and for the future of psychological and social science studies.

The figure from Weaver in Loet's excellent posting leaves a few aspects outside. The why, the what, the how long, the with whom, and other aspects of the information phenomenon do not enter. By doing that we have streamlined the phenomenon... and have left it ready for applying a highly successful theory, in the technological and in many other realms (linguistics, artif. intelligence, neurodynamics, molec. networks, ecol. networks, applied soc. metrics, etc). Pretty big and impressive, but is it enough? Shouldn't we try to go beyond?

I wonder whether a far wider "phenomenology of information" is needed (reminding what Maxine argued months ago about the whole contemplation of our own movement, or Plamen about the "war on cancer"?). If that inquiry is successful, we could find for instance that:

1. There are UNIVERSALS of information. Not only in the transmission or in the encoding used, well captured by the present theory, but also in the generation, in the "purpose", the "meaning", the targeted subject/s, in the duration, the cost, the value, the fitness or adaptive "intelligence", etc.

2. Those UNIVERSALS are SPECIES SPECIFIC.

3. Those UNIVERSALS would be organized, wrapped, around an ESSENTIAL CORE. It would consist in the tight ingraining of self-production and communication (almost inseparable, and both information based!). In the human special case, it is the whole advancement of our own lives that propels us to engage in endless communication --about the universals of our own species-- but with the terrific advantage of an open-ended communication system, language.

4. Those UNIVERSALS would have been streamlined in very different ways and taken as "principles" or starting points for a number of disciplines --remembering the discussion about the four Great Domains of Science. A renewed Information Science should nucleate one of those domains.

Best regards to all (and particular greetings to the new parties who have joined for this discussion),

--Pedro

El 27/06/2016 a las 12:43, Marcus Abundis escribió:

Dear Loet,

I hoped to reply to your posts sooner, as of all the voices on FIS I often sense a general kinship with your views. But I also confess I have difficulty in precisely grasping your views -- the reason for my delay.

> [While Shannon's] concept of information (uncertainty) is counter-intuitive.
> It enables us among other things to distinguish between "information" and
> "meaningful information".

• Easily agreed; *how* to distinguish a presumed meaning (or meaninglessness) then becomes the remaining issue.

> Providing . . . meaning presumes the specification of a system of reference;
> for example, an observer.

• It is telling for me (in viewing our differences and likenesses) that you suggest an observer. My "system of relating" accommodates but does not require an observer (okay -- observer, defined how?), as shown immediately below.

> Different[ly] . . . expected information is dimensionless ("a priori").

• I suggest the act of "expectation" already implies minimal dimensions -- for example, who/what/how is doing the expecting? Thus, in my view, this is not truly a priori. A "readiness" or a compelling functional need innate to any "system of relating" has bearing. For example, a single oxygen atom has a compelling/innate need to react with other elements, just as any agent is compelled to react to "nutrients." Both imply dimensional expectations, no? (Obviously -- of different orders/types.)

> In my opinion, a "real theory of meaning" should enable us to specify/measure
> meaning as redundancy / reduction of uncertainty given . . . I took this
> further in . . . The Self-Organization of Meaning and the Reflexive . . .

• My weak grasp of the concepts in this paper leads me to think you are actually modeling the "processing of meaning," related-to-but-distinct-from the "generating of meaning" that I target. I also vaguely recall(?) that in an offline exchange I asked you if you saw this paper as presenting a "theory of meaning" and you answered "No."

• In your later response to Pedro, I found your citation matrix an interesting example of your thinking, but still too "high-order" for my reductive-but-meaningful aim. Your matrix (for me) presents an essential complexity of high-order views, but in itself it is too simple to detail *how* a citatio