Re: [Fis] Can the can drink beer ? - No way!
Dear Bob,

I agree 100%. We must classify information as abiotic and biotic, but, in my opinion, both are parts of "information". Some scientists, including Tom Stonier, tried to define information in terms of its syntactic and semantic aspects. Can we draw a parallel between these concepts?

    Abiotic <--> syntactic <--> Shannon <--> machines
    Biotic  <--> semantic  <--> human

Should we abandon the insights of Shannon's theory in order to construct a unified, non-reductionist theory of information? Attached is a drawing to illustrate the von Neumann - Shannon conversation :-)

Um abraço (best regards),
Moisés

2017-03-26 11:30 GMT-03:00 Bob Logan:

> Hello Krassimir - I agree with the sentiments you expressed - they seem to
> parallel my thoughts.
>
> I am often puzzled by the way the term 'information' is used by physicists
> with regard to material objects. The way the term is used in physics, as in
> Wheeler's "it from bit", does not conform to my understanding of information
> as a noun describing the process of informing. How can abiotic matter be
> informed, when it cannot make any choices and hence cannot be informed?
> Living organisms make choices and use information to make those choices -
> all living creatures, from bacteria to humans, including physicists :-).
> The only information involved when physicists describe our universe using
> the word "information" is that associated with the physicists themselves
> becoming informed of what is happening in the universe they observe. I am
> happy for them to discuss this info, but I believe there is a need to
> distinguish between info (biotic) and info (abiotic) as used in physics.
> The use of a single word, information, for both categories is confusing, at
> least for me. This ambiguity reminds me of Shannon's use of the term
> entropy to define his notion of information, having taken the advice of
> von Neumann.
> A story is told that Shannon did not know what to call his measure, and
> von Neumann advised him to call it entropy because nobody knows what
> entropy means, which would therefore give Shannon an advantage in any
> debate (Campbell, Jeremy 1982, p. 32, *Grammatical Man: Information,
> Entropy, Language, and Life*. New York: Simon and Schuster). Shannon
> defined information in a way that, he admitted, was not necessarily about
> meaning. Information without meaning has no meaning for me.
>
> Kind regards to all - Bob Logan
>
> __
>
> Robert K. Logan
> Prof. Emeritus - Physics - U. of Toronto
> Fellow University of St. Michael's College
> Chief Scientist - sLab at OCAD
> http://utoronto.academia.edu/RobertKLogan
> www.researchgate.net/profile/Robert_Logan5/publications
> https://www.physics.utoronto.ca/people/homepages/logan/
>
> On Mar 26, 2017, at 5:39 AM, Krassimir Markov wrote:
>
> Dear Brian, Arturo, Karl, Alex, Lars-Goran, Gyuri, and FIS colleagues,
>
> Thank you for your remarks!
>
> What is important is that every theory has its own understanding of the
> concepts it uses. For "foreigners", their meaning may be strange or
> unknown. Sometimes the concepts of one theory contradict the corresponding
> concepts of another theory.
>
> Over the years, I have met many different definitions of the concept
> "information" and many more kinds of its use - from materialistic up to
> weird points of view...
>
> To clarify my own understanding, I shall give you a simple example:
>
> CAN THE CAN DRINK BEER?
>
> CAN THE CAN EXCHANGE BEER WITH THE GLASS?
>
> The can is used by humans for certain goals, for instance to store some
> beer for a given period. But the can itself "could not understand" its own
> functions or what it can do with the beer it contains. All its
> functionality is a model in human consciousness.
> The can cannot exchange beer with the glass if there is no human activity,
> or activity of additional devices invented by humans to support this.
>
> Further:
>
> CAN THE ARTIFICIAL LEG WALK?
>
> You know the answer ... A human with an artificial leg can walk ... All
> the functionality of an artificial leg is a result of human conscious
> modeling and invention.
>
> In addition:
>
> IS "PHYSICAL INFORMATION" INFORMATION?
>
> If it is, the first question is how to measure the quantity and quality of
> such "information", and who can do this? I prefer the answer "NO" -
> "physical information" is a concept which means something else, not
> "information" as I understand it. From my point of view, "physical
> information" is a kind of reflection (see the "Theory of reflections" of
> T. Pavlov). Every reflection may be regarded as information iff (if and
> only if) there exists a subjective informational expectation to be resolved
> by the given reflection. For physical information this law is not
> satisfied. Because of this, I prefer to call this phenomenon simply
> "a reflection".
>
> And so on ...
>
> Finally:
> Humans have invented many kinds of prostheses, including ones for our
> intellectual functionalities, i.e. many different kinds of electronic
> devices which, in particular, can generate electrical, light, etc.
> impulses, which we take to be "information"; usually a combination of
> impulses is taken to be a structure to be recognized by us as
> "information". A special kind of prostheses are robots. They have some
> autonomous functionalities but are still very far from living
> consciousness. The level of complexity of a robot's consciousness is far
> from a human's. Someone may say that robots understand and exchange
> "information", but they still only react to incoming signals following the
> instructions given by humans. Their functioning is similar to humans' -
> but only similar. They may ...
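An aside on Bob Logan's point that Shannon defined information in a way "not necessarily about meaning": Shannon's entropy H = -Σ p_i log2(p_i) is computed from symbol frequencies alone, so a meaningful sentence and a meaningless rearrangement of the same letters carry exactly the same Shannon information. A minimal Python sketch (the function name and sample strings are illustrative, not from the thread):

```python
# Shannon entropy depends only on symbol frequencies, not on meaning.
from collections import Counter
from math import log2

def shannon_entropy(text: str) -> float:
    """H = -sum(p_i * log2(p_i)) over the symbol probabilities p_i."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in counts.values())

meaningful = "the can cannot drink beer"
scrambled = "".join(sorted(meaningful))  # same letters, meaning destroyed

# Identical symbol frequencies -> identical entropy, yet only one "means" anything.
assert abs(shannon_entropy(meaningful) - shannon_entropy(scrambled)) < 1e-12
```

This is precisely the sense in which Shannon's measure is syntactic (the left column of Moisés's parallel) rather than semantic: it cannot distinguish the sentence from its scramble.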