Dear Loet, all,

Pedro has just helped me significantly: the answer is what he refers to as "interlocking goals" (although I'm not sure about "goals" - too teleological?).
What does that mean? Let's think about a transducer we are all familiar with, and one I am doing a lot of work around at the moment: https://news.liverpool.ac.uk/2018/01/31/liverpool-lead-international-diabetic-eye-disease-research-project/

The eye is an operationally closed system perturbed by its environment. It converts these perturbations into signals from which (via many other transducers - neurons, etc.) we determine what we are looking at. The fact that we determine the thing we are looking at to be a "thing", and that its thingness (the category, the distinction) is stable, indicates that the transduction is an ongoing process: indeed, it determines not only the thingness of the thing, but the I-ness (and eye-ness!) of me, the observer. If we take LSD, we mess with our transducers, and the thingness of the thing may become unstable. The thingness of what we see is the product of the interlocking goals of the transducers of the eye in its environment. In my project, I'm having to worry about the transductions in the discourse-related judgements of doctors looking at scans of eyes (taken through cameras... another transducer!).

In more detail: the eye is a complex system, but its environment is more complex still. The transduction process must involve attenuation of the perturbations from the environment: which ones to deal with? Which ones to ignore? However, if it were merely attenuative, we would not survive very long: some unnoticed (attenuated-out) catastrophe would soon see us off! So something else must happen alongside attenuation. Stafford Beer called this "amplification" - but the electronics analogy is perhaps misleading. Amplification refers to the generative capacity derived from the attenuated information; in reality, amplification means "adding redundancy" (some of this redundancy may involve actions in the world). So attenuation is leaving things out, and amplification is adding redundancy to the attenuated description.

But that's not quite it either, because there will be a difference, or error, between the amplified descriptions and the actual perturbations. Transduction, then, is continually adjusting its amplifications and attenuations to produce a stable state. The process is recursive: news of error at one transducer is passed on to other transducers - this is McCulloch's neural network; it's what the neurons do to the visual signal. As an aside, Bill Powers's Perceptual Control Theory could well be right (https://en.wikipedia.org/wiki/Perceptual_control_theory).
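To make that loop concrete, here is a minimal toy sketch in Python (entirely my own illustration - the functions, signals and constants are invented, and are not taken from Beer, Powers or the diabetic-eye project): a transducer attenuates the perturbation by throwing most of it away, amplifies what is left by regenerating a redundant description, and adjusts its gain until the description tracks what actually arrives.

import random

def attenuate(perturbation):
    # Attenuation: leave things out - keep only the first component.
    return perturbation[0]

def amplify(attenuated, gain):
    # Amplification: add redundancy by expanding the attenuated signal
    # back into a two-component description.
    return [attenuated * gain, attenuated * gain]

gain = 0.5  # the "amplification" to be tuned
for step in range(50):
    # A perturbation from an environment richer than the description.
    perturbation = [random.uniform(0.9, 1.1), random.uniform(0.9, 1.1)]
    description = amplify(attenuate(perturbation), gain)
    # The error: the difference between the amplified description and
    # what actually arrived.
    error = sum(p - d for p, d in zip(perturbation, description))
    # Adjust the amplification in proportion to the error; after a few
    # steps the gain settles near 1.0 - a crude picture of a transducer
    # holding a stable "thingness".
    gain += 0.1 * error

Nothing here is meant as a model of the eye, of course; it is just the bare shape of attenuate, amplify, compare, adjust.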
What does Shannon say? Shannon and Weaver (1998), The Mathematical Theory of Communication, University of Illinois Press, p. 57:

"Either of these [sender and receiver] will be called a discrete transducer. The input to the transducer is a sequence of input symbols and its output a sequence of output symbols. The transducer may have an internal memory so that its output depends not only on the present input symbol but also on the past history. We assume that the internal memory is finite, i.e. there exist a finite number m of possible states of the transducer and that its output is a function of the present state and the present input symbol. The next state will be a second function of these two quantities. Thus a transducer can be described by two functions:

   y(n) = f(x(n), α(n))
   α(n+1) = g(x(n), α(n))

where
   x(n) is the nth input symbol,
   α(n) is the state of the transducer when the nth input symbol is introduced,
   y(n) is the output symbol (or sequence of output symbols) produced when x(n) is introduced if the state is α(n).

If the output symbols of one transducer can be identified with the input symbols of a second, they can be connected in tandem and the result is also a transducer."

Although Shannon's idea of "memory" in the transducer is specifically related to his engineering challenge, the emergent state of the transducer is basically a generative model which produces output according to the input. In order to compensate for a noisy connection, one of the functions of the transducer is to add redundancy to the communication.
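Shannon's two functions are easy to write down directly. Here is a small Python sketch (the class, the tandem helper and the toy alphabet are my own inventions for illustration, not Shannon's notation): a discrete transducer is just a finite-state machine defined by f and g, and connecting two of them in tandem yields another transducer, exactly as the quotation says.

class Transducer:
    # Shannon's discrete transducer: output y(n) = f(x(n), α(n)),
    # next state α(n+1) = g(x(n), α(n)), with finite internal memory.
    def __init__(self, f, g, initial_state):
        self.f = f
        self.g = g
        self.state = initial_state

    def step(self, x):
        y = self.f(x, self.state)
        self.state = self.g(x, self.state)
        return y

    def run(self, xs):
        return [self.step(x) for x in xs]

def tandem(t1, t2):
    # Feed the output symbols of t1 to t2; the composite is again a
    # transducer whose state is the pair of the two states.
    return Transducer(
        f=lambda x, s: t2.f(t1.f(x, s[0]), s[1]),
        g=lambda x, s: (t1.g(x, s[0]), t2.g(t1.f(x, s[0]), s[1])),
        initial_state=(t1.state, t2.state),
    )

# A toy tandem pair: the first adds redundancy by doubling every second
# symbol (its one-bit memory is the "past history"); the second relabels.
repeater = Transducer(f=lambda x, s: x * (2 if s else 1),
                      g=lambda x, s: not s,
                      initial_state=False)
relabel = Transducer(f=lambda x, s: x.upper(),
                     g=lambda x, s: s,
                     initial_state=0)

print(tandem(repeater, relabel).run(list("abcd")))  # ['A', 'BB', 'C', 'DD']

The point of the tandem example is just Shannon's closing remark: chain transducers together (retina, neurons, camera, clinician, conversation) and the chain is itself a transducer.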
Luhmann, of course, based his theory on Maturana's structural coupling. But what is that, really? It's "interlocking goals" again, isn't it? Luhmann rightly sees the dynamics of discourse emerging from structural coupling between the social and psychic systems, double contingency, etc., but isn't that just a complex way of saying "there are multiple recursive transductions in communication - some in people's heads, and some in the conversations between people across different media and contexts"? (Conversation is a transduction process.) Again, if we all took LSD, it would all go haywire! I suspect our priority in life is to determine which transducers to tweak, how much, when and for how long... and which ones to leave alone!

Best wishes,

Mark

On 4 March 2018 at 15:41, Loet Leydesdorff <l...@leydesdorff.net> wrote:

> Dear Mark,
>
> Can you, please, explain "transduction" in more detail? Perhaps you can also provide examples?
>
> Best,
> Loet
>
> ------------------------------
> Loet Leydesdorff
> Professor emeritus, University of Amsterdam
> Amsterdam School of Communication Research (ASCoR)
> l...@leydesdorff.net ; http://www.leydesdorff.net/
> Associate Faculty, SPRU, <http://www.sussex.ac.uk/spru/> University of Sussex;
> Guest Professor Zhejiang Univ. <http://www.zju.edu.cn/english/>, Hangzhou; Visiting Professor, ISTIC, <http://www.istic.ac.cn/Eng/brief_en.html> Beijing;
> Visiting Fellow, Birkbeck <http://www.bbk.ac.uk/>, University of London;
> http://scholar.google.com/citations?user=ych9gNYAAAAJ&hl=en
>
> ------ Original Message ------
> From: "Mark Johnson" <johnsonm...@gmail.com>
> To: "Loet Leydesdorff" <l...@leydesdorff.net>
> Cc: y...@pku.edu.cn; "FIS Group" <fis@listas.unizar.es>
> Sent: 3/4/2018 1:03:17 PM
> Subject: Re: [Fis] A Paradox
>
> Dear Loet, all,
>
> I agree with this. Our construction of reality is never that of a single system: there are always multiple systems, and they interfere with each other in the way that you suggest. I would suggest that behind all the ins and outs of codification, or information and meaning, there is a very simple principle of transduction. I often wonder whether Luhmann's theory is really that different from Shannon's (who talks about transduction endlessly). The fact that you've made this connection explicit and empirically justifiable is, I think, the most important aspect of your work. You may disagree, but if we kept transduction and jettisoned the rest of Luhmann's theory, I think we would still maintain the essential point.
>
> There is some resonance (interesting word!) with McCulloch's model of perception, where he considered "drome" (literally, "course-ing", "running") circuits, each bearing on the other: http://vordenker.de/ggphilosophy/mcculloch_heterarchy.pdf (look at the pictures on pages 2 and 3). Perception, he argued, was a *syn*-drome: a combination of inter-effects between different circuits. There is a logic to this, but it is not the logic of set theory. McCulloch wrote about it. I think it's not a million miles away from Joseph's/Lupasco's logic.
>
> Best wishes,
>
> Mark
>
> On 4 March 2018 at 07:03, Loet Leydesdorff <l...@leydesdorff.net> wrote:
>
>> Dear Xueshan Yan,
>>
>> May I suggest moving from a set-theoretical model to a model of two (or more) helices? The one dimension may be the independent and the other the dependent variable at different moments of time. One can research this empirically; for example, in bodies of texts.
>>
>> In my own models, I declare a third level of codes of communication organizing the meanings in different directions. Meaning both codes the information and refers to horizons of meaning being specifically coded.
>>
>> Might this work as an answer to your paradox?
>>
>> Best,
>> Loet
>>
>> ------ Original Message ------
>> From: "Xueshan Yan" <y...@pku.edu.cn>
>> To: "FIS Group" <fis@listas.unizar.es>
>> Sent: 3/4/2018 2:17:01 AM
>> Subject: Re: [Fis] A Paradox
>>
>> Dear Dai, Søren, Karl, Sung, Syed, Stan, Terry, and Loet,
>>
>> I am sorry to reply to you late, but I have thoroughly read every post about the paradox, and they have brought me many inspirations; thank you. Now I offer my responses as follows:
>>
>> Dai, metaphor research is an ancient topic in linguistics, which reveals the relationship between tenor and vehicle, ground and figure, target and source, based on rhetoric. But where is our information? It looks like Syed has given the answer: "Information is the container of meaning." If I understand it correctly, we may draw this conclusion from it: information is the carrier of meaning. Since we all acknowledge that the sign is the carrier of information, the task of our Information Science would immediately become something like an intermediary between Semiotics (the study of signs) and Semantics (the study of meaning), which is what we absolutely do not want to see. For a long time, we have been hoping that the goal of Information Science is so basic that it can explain all information phenomena in the information age; much as Sung expects, it would consist of axioms, theorems or principles, so it could end all the debates on information, meaning, data, etc., but on this view it is very difficult to complete the mission. Syed, my statement is "A grammatically correct sentence CONTAINS information rather than the sentence itself IS information."
>> Søren believes that the solution to this paradox is to establish a new discipline at a higher level than both Information Science and Linguistics, such as his Cybersemiotics. I am not in a position to review your opinion, because I haven't read your book Cybersemiotics and I don't know its content, just as I don't know the content of Biosemiotics; but my view is that Peirce's Semiotics can't dissolve this paradox.
>>
>> Karl thought: "Information and meaning appear to be like key and lock", which are two different things. Without one, the existence of the other loses its value; this is a bit like the paradox of the hen and the egg. I don't know how to answer this point. However, your claims "The text may be an information for B, while it has no information value for A. The difference between the subjective." and "'Information' is synonymous with 'new'." touch on classic debates in Information Science; a typical example is given by Mark Burgin in his book: "A good mathematics textbook contains a lot of information for a mathematics student but no information for a professional mathematician." For this view, Terry has given a good answer: one should first label which context and paradigm they are using to define their use of the term "information." I think this is an effective first step toward constructing a general theory of information, if that is possible.
>>
>> For Stan's "Information is the interpretation of meaning, so transmitted information has no meaning without interpretation," I can only disagree kindly. The simplest example from genetics is: an egg cell accepts a sperm cell, and the fertilized egg contains a set of effective genetic information from the paternal and maternal cells; here information transmission has taken place, but is there any "meaning" or "explanation"? We should be aware that meaning is only a human or animal phenomenon and is not used in any other context (plant, molecule, cell, etc.); this is the key by which we dissolve the paradox.
>>
>> In general, I have not seen any effective explanation of this paradox so far.
>>
>> Best wishes,
>>
>> Xueshan
>>
>> *From:* Syed Ali [mailto:doctorsyedal...@gmail.com]
>> *Sent:* Tuesday, February 27, 2018 8:10 PM
>> *To:* Sungchul Ji <s...@pharmacy.rutgers.edu>
>> *Cc:* Terrence W. DEACON <dea...@berkeley.edu>; Xueshan Yan <y...@pku.edu.cn>; FIS Group <fis@listas.unizar.es>
>> *Subject:* Re: [Fis] A Paradox
>>
>> Dear All:
>>
>> If a non-English-speaking individual saw the newspaper headline "*Earthquake Occurred in Armenia Last Night*", would that be "information"?
>>
>> My belief is: yes. But he or she would have no idea what it was about; the meaning would be, possibly, "something", as opposed to the meaning an English-speaking individual would draw.
>>
>> In both situations there would still be meaning - A for the non-English speaker and B for the English speaker.
>>
>> Conclusion: Information is the container of meaning.
>>
>> Please critique.
>>
>> Syed
>> On Mon, Feb 26, 2018 at 5:43 PM, Sungchul Ji <s...@pharmacy.rutgers.edu> wrote:
>>
>> Hi FISers,
>>
>> I am not sure whether I am a Procrustes (*bed*) or a Peirce (*triadomaniac*), but I cannot help seeing an ITR (Irreducible Triadic Relation) among Text, Context and Meaning, as depicted in *Figure 1*.
>>
>>                  f              g
>>     *Context* ------> *Text* ------> *Meaning*
>>         |                                ^
>>         |                                |
>>         |________________________________|
>>                         h
>>
>> "The meaning of a text is irreducibly dependent on its context."
>>
>> "Text, context, and meaning are irreducibly triadic." The "TCM principle" (?)
>>
>> *Figure 1.* The Procrustean bed, the Peircean triadomaniac, or both? *f* = Sign production; *g* = Sign interpretation; *h* = Correlation or information flow.
>>
>> According to this 'Peircean/Procrustean' diagram, both what Terry said and what Xueshan said may be valid. Although their thinking must have been irreducibly triadic (*if Peirce is right*), Terry may have focused on (or prescinded) Steps *f* and *h*, while Xueshan prescinded Steps *g* and *h*, although he did indicate that his discussion was limited to the context of human information and human meaning (i.e., Step *f*). Or maybe there are many other interpretations possible, depending on the interpreter of the posts under discussion and the ITR diagram.
>>
>> There are an infinite number of examples of algebraic operations: 2 + 3 = 5, 3 - 1 = 2, 20 x 45 = 900, etc., etc.
>>
>> If I say "2 + 3 = 5", someone may say, "but you missed 20 x 45 = 900". In other words, no matter what specific algebraic operation I come up with, my opponent can always succeed in coming up with an example I missed. The only solution to such an endless debate would be to discover the axioms of algebra, at which level there cannot be any debate. When I took an abstract algebra course as an undergraduate at the University of Minnesota, Duluth, in 1962-5, I could not believe that underlying all the complicated algebraic calculations possible there are only 5 axioms (https://www.quora.com/What-is-the-difference-between-the-5-basic-axioms-of-algebra).
>>
>> So can it be that there are axioms (either symbolic, diagrammatic, or both) of information science waiting to be discovered, which will end all the heated debates on information, meaning, data, etc.?
>>
>> All the best.
>>
>> Sung
>>
>> ------------------------------
>> *From:* Fis <fis-boun...@listas.unizar.es> on behalf of Terrence W. DEACON <dea...@berkeley.edu>
>> *Sent:* Monday, February 26, 2018 1:13 PM
>> *To:* Xueshan Yan
>> *Cc:* FIS Group
>> *Subject:* Re: [Fis] A Paradox
>>
>> It is so easy to get into a muddle mixing technical uses of a term with colloquial uses; add a dash of philosophy and discipline-specific terminology and it becomes mental quicksand. Terms like 'information' and 'meaning' easily lead us into these sorts of confusions because they have so many context-sensitive and paradigm-specific uses.
>> This is well exhibited in these FIS discussions, and is a common problem in many interdisciplinary discussions. I have regularly requested that contributors to FIS try to label which paradigm they are using to define their use of the term "information" in these posts, but sometimes, like fish unaware that they are in water, one forgets that there can be alternative paradigms (such as the one Søren suggests).
>>
>> So, to try to avoid overly technical usage, can you be specific about what you intend to denote with these terms?
>>
>> E.g. for the term "information": are you referring to statistical features intrinsic to the character string with respect to possible alternatives, or to what an interpreter might infer that this English sentence refers to, or to whether this reference carries use value or special significance for such an interpreter?
>>
>> And e.g. for the term "meaning": are you referring to what a semantician would consider its underlying lexical structure, or to whether the sentence makes any sense, or refers to anything in the world, or how it might impact some reader?
>>
>> Depending on how you specify your uses, your paradox will either become irresolvable or dissolve.
>>
>> — Terry
>>
>> On Mon, Feb 26, 2018 at 1:47 AM, Xueshan Yan <y...@pku.edu.cn> wrote:
>>
>> Dear colleagues,
>>
>> In my teaching career in Information Science, I have often been puzzled by the following inference, which I call the *Paradox of Meaning and Information*, or the *Armenia Paradox*. In order not to produce unnecessary ambiguity, I state it below and strictly limit our discussion to the human context.
>>
>> Suppose an earthquake occurred in Armenia last night and all of the main media of the world have reported it. The next day, two students A and B are discussing the newspaper headline "*Earthquake Occurred in Armenia Last Night*":
>>
>> Q: What is the *MEANING* contained in this sentence?
>> A: An earthquake occurred in Armenia last night.
>>
>> Q: What is the *INFORMATION* contained in this sentence?
>> A: An earthquake occurred in Armenia last night.
>>
>> Thus we come to the conclusion that *MEANING is equal to INFORMATION*, or strictly speaking, human meaning is equal to human information. In Linguistics, the study of human meaning is called Human Semantics; in Information Science, the study of human information is called Human Informatics.
>>
>> Historically, Human Linguistics has two definitions: (1) it is the study of human language; (2) also called Anthropological Linguistics or Linguistic Anthropology, it is the historical and cultural study of a human language. Without loss of generality, we only adopt the first definition here, so we regard Human Linguistics and Linguistics as the same.
>>
>> Since Human Semantics is one of the disciplines of Linguistics and its main task is to deal with human meaning, and Human Informatics is one of the disciplines of Information Science and its main task is to deal with human information, and since human meaning is equal to human information, we have the following corollary:
>>
>> A: *Human Informatics is a subfield of Human Linguistics*.
>>
>> According to the definition of general linguists, language is a vehicle for transmitting information; therefore, Linguistics is a branch of Human Informatics, so we have another corollary:
>>
>> B: *Human Linguistics is a subfield of Human Informatics*.
>> Apparently, A and B are contradictory, or logically unacceptable. It is a paradox in Information Science and Linguistics. In most cases, the settlement of such a paradox leads to important discoveries in a subject, but how should we understand this one?
>>
>> Best wishes,
>>
>> Xueshan
>>
>> --
>> Professor Terrence W. Deacon
>> University of California, Berkeley

--
Dr. Mark William Johnson
Institute of Learning and Teaching
Faculty of Health and Life Sciences
University of Liverpool

Phone: 07786 064505
Email: johnsonm...@gmail.com
Blog: http://dailyimprovisation.blogspot.com
_______________________________________________ Fis mailing list Fis@listas.unizar.es http://listas.unizar.es/cgi-bin/mailman/listinfo/fis