On Tue, Mar 9, 2021, 1:36 PM WriterOfMinds <[email protected]>
wrote:

>
> Matt seems to think he knows a lot about how this would feel. He thinks
> that if he connected his brain and a dog's, he would never be able to
> achieve anything more than getting the dog's senses added to his ... rather
> than perceiving the dog's complex of sensations, thoughts, and emotions as
> "the other," an alien and complete presence in his mind. But he quite
> obviously doesn't know anything, because he hasn't done the experiment.
>

I'm guessing based on experiments with electrical brain stimulation. Ben
didn't specify exactly how the connection would be made, so I made some
assumptions that I thought were reasonable.

>
> I maintain that denying the reality of one's *own* consciousness is
> irrational, insane.
>

I have the same positive reinforcement of thinking, perception, and action
as everyone else. Consciousness seems real to me. I would have no reason to
live without this illusion of a soul, a little person in my head that
experiences the world and that I can imagine going to heaven or into a
computer after I die. I just know that all the evidence says otherwise. I
can no more turn off the illusion than I can turn off pain just because I
know that pain is certain neurons firing.

A lot of the discussion on this list stems from implicit disagreement over
the definitions of words like "consciousness", "intelligence", or
"singularity". Meanings also change over time. That's why we now have AGI
to mean what AI used to mean. That's why computers can never exceed
human-level intelligence, even though they did 70 years ago.

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Ta5ed5d0d0e4de96d-M703ca3e5ad2d65a4f53212f7
Delivery options: https://agi.topicbox.com/groups/agi/subscription