The crux of the discussion should be transhumanism, not transmutation. Your dog 
and toaster arguments are silly. Any assumed transfer of emotion and 
consciousness (as awareness-plus) presupposes a suitable platform with a 
suitable architecture to accommodate such a transfer.

As for your points on singularity, there are different uses of the term, and 
perhaps even different definitions. I think the singularity has no bearing on 
the possibilities for transferring general intelligence from human to machine, 
human to human form (a corpse), or augmenting human intelligence with 
non-biological computational architecture (in the sense of cyborg).

________________________________
From: Matt Mahoney <[email protected]>
Sent: Tuesday, 09 March 2021 18:26
To: AGI <[email protected]>
Subject: Re: [agi] Patterns of Cognition

It depends on how you make the connections between brains. The sensible way would 
be to add connections gradually so you are not overwhelmed with novel 
sensations, and then only after determining that the neurons have similar 
meanings. For example, I would feel hungry when the dog is hungry, but 
different enough (dog hungry vs human hungry) that we are not fooled as to who 
needs to eat.

In that sense, it would feel to me like the dog was conscious. But it would be 
the same feeling I have now that I am conscious. I just wired my brain to 
believe the dog is conscious. I could wire my brain to believe anything I 
wanted. It's not evidence that the belief is true.

Likewise, if I connected a toaster to my nucleus accumbens so that I got some 
positive reinforcement when it made toast, it would feel to me like the 
toaster consciously wants to make toast of its own free will. What would that 
prove?

Anyway, I assume that, as an AGI researcher, you don't believe that the 
brain is doing anything that can't in principle be done by a computer.

Also, "after the singularity" is a logical contradiction. The singularity is 
the point where the rate of recursive self improvement goes to infinity. It is 
infinitely far into the future measured in perceptual time or in number of 
irreversible bit operations. Time would not exist "afterwards", just like there 
are no real numbers after infinity. That is, if the universe were infinite so 
that physics even allowed a singularity to happen in the first place.
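The claim above can be made concrete with a toy model. A minimal sketch, assuming (my assumption, not Mahoney's exact definition) that capability x grows hyperbolically, dx/dt = x^2, and that "perceptual time" is the integral of capability over external time: the capability diverges at a finite external time t = 1, while the accumulated perceptual time grows without bound as that point is approached.

```python
import math

def capability(t):
    """Closed-form solution of dx/dt = x**2 with x(0) = 1: x(t) = 1/(1 - t).
    Diverges at the finite external time t = 1 (the 'singularity')."""
    return 1.0 / (1.0 - t)

def perceptual_time(t):
    """Integral of capability(t) from 0 to t: -ln(1 - t).
    Grows without bound as t approaches 1, so the singularity is
    'infinitely far away' in perceptual time despite being finitely
    far away in external time."""
    return -math.log(1.0 - t)

for t in (0.9, 0.99, 0.999999):
    print(f"t={t}: capability={capability(t):.4g}, "
          f"perceptual time={perceptual_time(t):.4g}")
```

So in this toy model there is no "after": external time t = 1 is never reached from the inside, because subjective time diverges first.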

On Tue, Mar 9, 2021, 1:25 AM Ben Goertzel <[email protected]> wrote:
> So let's try it. If I randomly connect my neurons to the neurons in a dog's 
> brain, then I get a lot of novel sensations that just confuse me. After years 
> of experiments I learn their meanings. When I taste metal, it means the dog 
> is scratching its left ear, and so on.
>
> Eventually our minds work as one. It's as if I have two bodies, one human and 
> one dog. It doesn't tell me if the dog is conscious because it feels like 
> there is only one consciousness connected to both bodies.

But I suspect the interesting part occurs between the above two
paragraphs: in the state where it is still quite as if a separate
system is feeding you sensations, yet not quite as if there is a
single mind spanning the human and dog bodies. There will be an
intermediate state where you sense the dog's consciousness
subjectively and experientially, in the vein of what Martin Buber
called an "I-Thou" experience.

And this state will not be remotely so intense an I-Thou experience if
the dog is replaced with a toaster...

I don't expect you to believe this will happen, given your current
state of understanding.   But I do expect that if you survive the
Singularity, you'll look back at some point and remember this chat and
experience a nanosecond of mild amusement that silly Ben was right
about this ;)
Artificial General Intelligence List<https://agi.topicbox.com/latest> / AGI / 
see discussions<https://agi.topicbox.com/groups/agi> + 
participants<https://agi.topicbox.com/groups/agi/members> + delivery 
options<https://agi.topicbox.com/groups/agi/subscription> 
Permalink<https://agi.topicbox.com/groups/agi/Ta5ed5d0d0e4de96d-M3946dead179b21641762508e>
