Turing answered the question of whether machines could think. A lot of people don't like his answer. But nobody has a better one.
Thinking is neurons firing, performing a computation that could in principle be done by a computer, producing the same outputs for the same inputs. What most people find baffling is that thinking feels like something. It shouldn't be: feelings are also just neurons firing. But of course it is baffling. No computer can fully model itself, so it would be more surprising if it wasn't.

On Tue, Mar 3, 2020, 5:31 PM Alan Grimes via AGI <[email protected]> wrote:

> I'm baffled.
>
> Back in 1979 Douglas Hofstadter wrote his masterpiece, "Gödel, Escher,
> Bach: An Eternal Golden Braid" (shortened to GEB:EGB).
>
> It is the one essential classic in the field of AI. I'd bring out my
> "Read A Book" YouTube link, but it has the prohibited word in it. =P
>
> Basically, Hofstadter covers the Gödel incompleteness theorem and how
> formal systems are either incomplete or inconsistent.
>
> Picture the flow of data through your CPU. You have the instruction
> stream, which isn't really interesting, and the data stream, both
> moving at tens of gigabytes per second in the latest processors.
>
> In conventional computing, text is **encoded**. It is this encoded form
> that passes through the processor. The CPU can't see the words, it
> can't hear the words, it can't feel the words, it can't poet the words,
> it can't sing the words, it can't H4x0r the words, and it can't
> understand the words. It can only instruction the words.
>
> (Nice illustration on page 310, accompanying text in the "... Ant Fugue".)
>
> Language is a bit sketchy here: you could call it emergence, or you
> could say you are going up a few layers of abstraction; neither is
> correct.
>
> Imagine a concept that feels "emergent" but has been carefully
> engineered and optimized.
>
> Imagine a concept like a higher layer of abstraction, but where the
> higher layers are unaligned with the lower layers.
>
> Imagine a voice-activated NLP system that completely omits the
> speech-to-text part.
>
> Now I don't know how many layers of abstraction are required, or
> whether the idea of a layer of abstraction is even meaningful in this
> context. What I am saying is that there is no possible way to create a
> thing that can think simply because it can process words; rather, it is
> possible to create a thing that can think, in complex ways, about any
> form of stimulus, including words. That is where we must look for AGI.
>
> PS: The computer I'm typing this on is named Tortoise; the machine in
> the other room is called Achilles. ;)
>
> --
> Clowns feed off of funny money;
> Funny money comes from the FED
> so NO FED -> NO CLOWNS!!!
>
> Powers are not rights.

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: https://agi.topicbox.com/groups/agi/Ta07783a765685323-M4beb988d2ad29eb046647973
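Grimes's point that only *encoded* text ever passes through the processor can be made concrete. A minimal Python sketch (the sample string and choice of UTF-8 are my own, purely for illustration):

```python
# Text never reaches the machine as "words": it is encoded into
# integers, and only those integers flow through the processor.
text = "Godel Escher Bach"
encoded = text.encode("utf-8")    # the bytes the machine actually moves

print(list(encoded)[:5])          # the CPU sees only these numbers
# The mapping back to "words" exists only in the decoding convention:
print(encoded.decode("utf-8") == text)
```

Nothing about the integers `71, 111, 100, ...` is word-like; "understanding" them as text is a convention applied at the boundary, which is exactly the gap the email is pointing at.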

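The claim above that no computer can model itself is usually backed by a diagonal, halting-problem-style argument: any claimed perfect predictor can be fed a program built to do the opposite of whatever is predicted about it. A toy Python sketch (all names here are hypothetical, chosen for illustration):

```python
# Toy diagonalization: defeat a claimed predictor of program behavior
# by constructing a program that consults the predictor about itself.
def contrarian_for(predictor):
    """Build a program that defeats `predictor` via self-reference."""
    def me():
        # Ask the predictor what `me` will return, then do the opposite.
        return not predictor(me)
    return me

def always_true(program):
    # A (necessarily imperfect) predictor: claims every program returns True.
    return True

me = contrarian_for(always_true)
print(me())  # False: the prediction was True, so the program did the opposite
```

Whatever the predictor answers, the constructed program contradicts it, so no predictor inside the system can be right about every program, including itself.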