> On 4 May 2019, at 16:58, [email protected] wrote:
>
> It seems people will remain in the delusion that software or programming in a conventional computer device - even with many processors - will achieve consciousness. Searle's Chinese Room argument still does apply here, as anyone should clearly be able to see.
Not at all. John Searle's argument confuses the activity of a program with the activity of a program emulating the first program. I can simulate Einstein's brain, but that does not make me into Einstein. It just gives me the opportunity to discuss with Einstein.

> One can wave the magic word "cybernetic" around all one wants, but it is clearly not useful.
>
> There are lots of delusions in the world: Ghosts, spirits, gods, and the "cybernetic" one above is among them.

You talk a bit as if you knew the truth. But your theory is not stated clearly enough for me to see a theory in the usual sense of the word.

Bruno

> @philipthrift
>
> On Saturday, May 4, 2019 at 9:42:40 AM UTC-5, Terren Suydam wrote:
> I'm beginning to suspect that you're a chatbot... a pretty good one - the best I've seen, even. Your responses are syntactically correct and seemingly relevant semantically, but whenever I or anyone else tries to pin you down and get you to articulate specifics, your response is inevitably to quote some article or another. Getting closer to passing the Turing Test - give your creator my respect.
>
> On Sat, May 4, 2019 at 10:15 AM <[email protected]> wrote:
>
> I understand basically what your idea is, but "cybernetic dynamics" reminds me of Norbert Wiener's subject of cybernetics, something I read about decades ago:
>
> https://en.wikipedia.org/wiki/Cybernetics:_Or_Control_and_Communication_in_the_Animal_and_the_Machine
>
> One should be able to replace every neural+glial cell with a synthetic one, but the technology has to advance:
>
> https://neo.life/2018/05/the-birth-of-wetware/
>
> ...
>
> Pink juice
>
> Koniku’s chemical sensor is still in development, so what Agabi and Sadrian show me is likely to continue evolving for some time.
> On the outside, it sports a globular, gray-green shell with a vaguely alien look, about eight inches wide. Inside, metal architecture supports a silicon chip with spidery wires converging in the center, where networked neurons sit inside a clear bubble made of a biocompatible polymer.
>
> When a client tells Koniku what substance it wants to sense, the company identifies cellular receptors that would ordinarily bind to that substance. Then it creates neurons that have those receptors. To do that, it uses gene-editing technology to tweak the DNA of neuron precursors. Koniku obtains those from a supplier, which manipulates skin or blood cells from mice into blank-slate cells known as induced pluripotent stem cells.
>
> Once Koniku has nurtured these engineered precursors into living neurons, they could, in theory, smell odors like a drug-sniffing dog might. Or they could detect any number of substances that have corresponding receptors. Some receptors are more sensitive and narrowly tuned to attach to one substance. Others are, as Agabi puts it, more “promiscuous,” accepting an entire class of chemicals, like nitrates. The Koniku Kore contains neurons with both types of receptors.
>
> After they’ve created their mix of customized neurons, Agabi and his colleagues use the Death Star laser to build a polymer structure for the neurons to sit on. Then they place the cells on that structure and wait for them to begin to network together among a set of mushroom-shaped electrodes. Ultimately, a few “reporter” neurons will serve as the essential neuron-silicon connection. This means they are both connected to the neuron network and “plugged in” to the chip using the natural process of endocytosis, in which a cell gradually engulfs foreign matter. Agabi says Koniku has developed a special DNA coating for its electrodes.
> When a neuron tries to engulf the DNA, it creates a seal that will later let the electrode pick up electrical signals the neuron produces when its receptors bind to a given chemical or class of chemicals.
>
> Almost all of this technology was around before Koniku, though not exactly in this arrangement. Perhaps the newest element here is what Agabi calls “pink juice.” The usual life span of a neuron in a lab is counted in days or weeks, but Koniku’s neurons can survive for up to two months. That’s because they’re bathed in pink juice, which feeds them and keeps them alive.
>
> At first, Agabi won’t tell me the exact recipe beyond saying that it’s a mix of “vitamins, minerals, and sugars.” But I piece some of it together by talking to Thomas DeMarse, a neuroscientist at the University of North Carolina.
>
> Biology is technology, Agabi says. Everything else is a simulation.
>
> DeMarse spent time in the spotlight in the early 2000s for his research teaching rat neurons in a dish to fly a virtual plane by connecting them to flight simulator software. He also did groundbreaking research on neuron survival. He points out that there are a number of similar “juices” already on the market, with names like BrainPhys and Neurobasal. Those pink juices get their color from a substance called phenol red, which indicates the liquid’s pH level. They also contain a carbonate buffer that helps maintain acidity and simulates conditions in the brain. Using similar materials, DeMarse was able to keep neurons alive on a desk for two years. They would have lived longer, he says, but during that time he moved from Caltech to Georgia Tech, and the plates started to leak en route.
>
> Later, when I ask Agabi if he’ll at least tell me whether his pink juice contains phenol red and a carbonate buffer, he confirms the first and denies the second.
> Academic groups may have needed the carbonate buffer to simulate the brain, but unlike those neuroscience labs, Koniku is unconcerned with mimicking the brain, Agabi says. “The power of the neuron comes from the computational density — as long as we maintain that, we can change everything else.”
>
> With the help of Koniku’s pink juice and a new automated pump system that will be incorporated into each sensor, Agabi expects to eventually reach DeMarse’s record for neuron longevity. Even then, his customers would have to swap out their Koniku equipment every two years, but no one has requested products with greater neuron longevity — and therefore, Agabi says, it has not been a development priority. With the technology at hand, he says, he could develop a Koniku Kore that would last five years, were a customer to require it.
>
> Improving on evolution
>
> “To me the devil is in the details here,” says DeMarse. What he means is: before Koniku, its kind of wetware lived in academic and government labs. In addition to DeMarse’s research, scientists at DARPA have worked for a long time on an artificial nose to detect cancer. William Ditto, now of the Nonlinear Artificial Intelligence Lab at North Carolina State University, used leech neurons in a dish to carry out basic computations. Although no one has done exactly what Koniku says it’s doing, there’s plenty to back up the argument that someone could do it. In fact, DeMarse says he was “tickled” to read about Koniku’s innovations. Gabriel A. Silva, director of the Center for Engineered Natural Intelligence at the University of California, San Diego, is also intrigued by Koniku’s potential. “I never underestimate groups like this because they’re trailblazers,” he says.
>
> Still, Agabi’s colleagues in the academic world maintain some skepticism about whether his technology can live up to his grand ambitions and radical vision for the future of machine intelligence.
> For one thing, neurons have evolutionary baggage that might be unnecessary for a computer. As an example, Rajesh Rao, director of the Center for Neural Engineering at the University of Washington, points to myelin, the fatty sheath that insulates nerve fibers and helps signals propagate in the brain. It’s not clear, Rao says, that the optimal computer would have to mimic that method of communication. Or consider dendrites, the branches that stretch out from the body of a neuron. Neuroscientists aren’t sure whether dendrites actually participate in information processing or are just wires that pass information from cell to cell. Does moving information in a computer really demand some version of dendrites?
>
> With issues like this in mind, all the scientists I spoke with for this article said that while looking to biology for inspiration will be essential for the development of AI, they were not entirely convinced by Agabi’s argument that it will require biology itself. Just as planes use the same principles of lift as birds do without feathers or hollow bones, “we can extract the computational principles of how the brain processes information” and use them in a manner “independent of actual implementation in biological tissue,” Rao says.
>
> For example, neuromorphic chips are silicon chips designed using biological principles, attempting to mimic some ways that the brain processes information while leaving some of its baggage behind. Ditto, the researcher who once made a computer out of leech neurons, is now working on a “chaotic chip,” which constantly changes from analog to digital processing — as often as a billion times a second — in order to solve problems more efficiently. He argues that AI will require the plasticity and adaptive capacity of biology, but that the biological element is optional.
>
> After all, coaxing neurons in a dish into computation isn’t so easy, either.
> Even making sure they grow successfully is difficult; Silva remembers struggling during graduate school with neurons that had suddenly stopped growing, seemingly for no reason. “It turned out that the manufacturer of the coverslips we used had changed the formulation of the glass,” he says. “That alone was enough to make the neurons unhappy.” Even when they do grow, a group of neurons, however well networked and organized, does not automatically make a brain. The distance from chemical sensing to cognition is awfully long, and the slippery nature of even the idea of cognition complicates this question. A basic system that uses reward or punishment to teach things to computers “is going to give you some behavior that will look intelligent,” Rao says. But isn’t there more to cognition than that, more ingredients and sensory inputs that help us react to, interact with, and make sense of the world? The wetware recipe for that is far from clear.
>
> ...
>
> @philipthrift
>
> On Saturday, May 4, 2019 at 8:33:09 AM UTC-5, Terren Suydam wrote:
> I should add that the cybernetic description of a system is entirely functional, but the emphasis is on the holistic perspective. Functionalism tends to be reductive, but the consciousness identified with a given cybernetic description is the system as a whole. That's why replacing a neuron with an artificial replacement does not change the consciousness.
>
> On Sat, May 4, 2019 at 9:30 AM Terren Suydam <[email protected]> wrote:
> What I'm suggesting draws on both functionalism and identity theory. It's functional in the sense that the constitutive aspect of cybernetics is entirely functional. There is nothing in a cybernetic description beyond the functional relationships between the parts of that system. It draws on identity theory in the sense that I'm claiming that consciousness is cybernetic dynamics.
> What I'm adding is the same move that panpsychism makes - that there is something it is like to be any cybernetic system, and this includes many more things than brains, and crucially, does not depend on a specific substrate.
>
> On Sat, May 4, 2019 at 9:13 AM <[email protected]> wrote:
>
> I must assume you have already studied (hopefully over many years) in philosophy the difference between
>
> functionalism: https://plato.stanford.edu/entries/functionalism/
>
> and
>
> identity theory: https://plato.stanford.edu/entries/mind-identity/
>
> A short way of expressing identity theory over functionalism is
>
> A simulation is not a synthesis.
>
> Experiential materialism is a variant of identity theory in which
>
> • psychical properties, as well as physical ones, are attributed to matter, which is the only basic substance
>
> so that
>
> • the material composition of the brain has both physical and psychical aspects.
>
> @philipthrift
>
> On Saturday, May 4, 2019 at 7:38:46 AM UTC-5, Terren Suydam wrote:
> Maybe you could tell me what specific criticism you have rather than quoting a Wikipedia article.
>
> On Fri, May 3, 2019 at 7:50 PM <[email protected]> wrote:
>
> I don't believe in the "functional equivalence" principle
>
> https://en.wikipedia.org/wiki/Functionalism_(philosophy_of_mind)
>
> as it does not capture the nature of what is needed for consciousness (as many critics - some listed there - have pointed out).
>
> If I had to pick something vs. "cybernetic dynamics" it would be "neurochemical dynamics". That seems closer to me.
>
> @philipthrift
>
> On Friday, May 3, 2019 at 5:31:56 PM UTC-5, Terren Suydam wrote:
> Then you're missing the point of the alternative I've been offering.
> It's not about the matter itself, it's about the cybernetic dynamics implemented in the matter. So I would predict that you could replace your brain neuron by neuron with functional equivalents and your consciousness wouldn't change, so long as the cybernetics were unchanged.
>
> On Fri, May 3, 2019, 6:08 PM <[email protected]> wrote:
>
> Well we know some matter has a psychical aspect: human brains.
>
> Unless one is a consciousness denier.
> - https://www.nybooks.com/daily/2018/03/13/the-consciousness-deniers/
>
> @philipthrift
>
> On Friday, May 3, 2019 at 4:58:04 PM UTC-5, Terren Suydam wrote:
> Panpsychism of any flavor that identifies matter with a psychic aspect is subject to the problems I described earlier.
>
> It never occurred to me to google something like "theoretical psychology" <https://www.google.com/search?q=theoretical+psychology> but there's a lot there. How much of it is interesting, I don't know.
>
> I think as we flesh out the connectome, theoretical psychology will take on more legitimacy and importance.
>
> On Fri, May 3, 2019 at 5:16 PM <[email protected]> wrote:
>
> There is a whole spectrum of panpsychisms (plural) - from micropsychism to cosmopsychism:
>
> https://plato.stanford.edu/entries/panpsychism/
> cf. https://www.iep.utm.edu/panpsych/
>
> That it is not a "real science" yet is its basic problem, of course. But consciousness science in general really isn't yet either.
>
> One would think there would be a group of theoretical psychologists who would be involved - there is theoretical physics, chemistry, and biology, but theoretical psychology is in a much weirder state.
> @philipthrift
>
> On Friday, May 3, 2019 at 3:48:40 PM UTC-5, Terren Suydam wrote:
> My question for panpsychists is similar to my question for Cosmin: what does it buy you in terms of explanations or predictions?
>
> Just blanket-asserting that all matter is conscious doesn't tell me anything about consciousness itself. For example, what would it mean for my fingernails to be conscious? Does my fingernail consciousness factor in somehow to my own experience of consciousness? If so, how? What about all the other parts of my body, or individual cells? Do the bacteria living in my body contribute their consciousness somehow? It quickly runs aground on the same rocks that arguments about "soul" do - there's no principled way to talk about it that elucidates relationships between brains, bodies, and minds. Panpsychism does nothing to explain the effect of drugs on consciousness, or brain damage. Like Cosmin's ideas, it's all just post-hoc rationalization. Panpsychism is the philosophical equivalent of throwing your hands up and saying "I dunno, I guess it's all conscious somehow!"
>
> What I'm suggesting posits that consciousness arises from the cybernetic organization of a system, that what the system experiences, as a whole, is identified with the informational dynamics captured by that organization. This yields explanations for the character of a given system's consciousness... something panpsychism cannot do.
>
> Terren
>
> On Fri, May 3, 2019 at 3:57 PM <[email protected]> wrote:
>
> I see the coin (like the ones lying on my desk right now, made of metal) as made of matter.
> The two sides of the coin (of matter) are physical and psychical:
>
> https://codicalist.wordpress.com/2019/01/22/matter-gets-psyched/
>
> If ὕ – the first letter of the Greek “hyle”, upsilon (υ) with diacritics dasia and oxia (U+1F55) – is used as the symbol for matter, φ (phi) for the physical, and ψ (psi) for the psychical, then
>
> ὕ = φ + ψ
>
> (i.e., the combination of physical and psychical properties is a more complete view of what matter is). The physical is the (quantitative) behavioral aspect of matter – the kind that is formulated in mathematical language in current physics, for example – whereas the psychical is the (qualitative) experiential aspect of matter, at various levels, from brains on down. There is no reason in principle for only φ to be considered by science and for ψ to be ignored by science.
>
> @philipthrift
>
> On Friday, May 3, 2019 at 2:10:05 PM UTC-5, Terren Suydam wrote:
> I see them as two sides of the same coin - as in, you don't get one without the other.
>
> On Fri, May 3, 2019 at 3:00 PM <[email protected]> wrote:
>
> If "consciousness doesn't supervene on physical [or material] computation" then does that mean there is a realm for (A) consciousness and one for (B) physical [or material] computation?
>
> Is A like some spirit or ghost that invades the domain of B? Or does B invade A?
>
> @philipthrift
--
You received this message because you are subscribed to the Google Groups "Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an email to [email protected].
To post to this group, send email to [email protected].
Visit this group at https://groups.google.com/group/everything-list.
To view this discussion on the web visit https://groups.google.com/d/msgid/everything-list/5E3F7905-CC1D-49A4-BAB6-0B273BA0BD85%40ulb.ac.be.
For more options, visit https://groups.google.com/d/optout.

