It depends how you make the connections between brains. The sensible way
would be to add connections gradually, so you are not overwhelmed with
novel sensations, and only after determining that the neurons have similar
meanings. For example, I would feel hungry when the dog is hungry, but the
sensation would be different enough (dog hunger vs. human hunger) that we
are not fooled as to who needs to eat.

In that sense, it would feel to me like the dog was conscious. But it would
be the same feeling I have now that I am conscious. I just wired my brain
to believe the dog is conscious. I could wire my brain to believe anything
I wanted. It's not evidence that the belief is true.

Likewise, if I connected a toaster to my nucleus accumbens so that I got
some positive reinforcement when it made toast, it would feel to me like
the toaster consciously wants to make toast of its own free will. What
would that prove?

Anyway, I assume that as an AGI researcher, you don't believe that the
brain is doing anything that can't in principle be done by a computer.

Also, "after the singularity" is a logical contradiction. The singularity
is the point where the rate of recursive self-improvement goes to infinity.
It is infinitely far into the future measured in perceptual time or in
number of irreversible bit operations. Time would not exist "afterwards",
just as there are no real numbers after infinity. That is, assuming the
universe is infinite, so that physics even allows a singularity to happen
in the first place.
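This can be illustrated with a toy model (my own illustration, not a claim
about the actual dynamics of self-improvement). If capability x grows as
dx/dt = x^2 with x(0) = 1, then x(t) = 1/(1 - t) blows up at wall-clock
time t = 1. But if "perceptual time" accumulates in proportion to
capability, then perceived time up to T is the integral of x dt, which is
-ln(1 - T), and that diverges as T approaches 1: a finite clock time, an
infinite subjective time.

```python
# Toy model of a finite-time singularity: dx/dt = x^2, x(0) = 1.
# Closed-form solution: x(t) = 1/(1 - t), which blows up at t = 1.
# "Perceptual time" is modeled (an assumption of this sketch) as the
# integral of x dt from 0 to T, which equals -ln(1 - T).
import math

def clock_and_perceptual_time(T):
    """Return (wall-clock time, perceived time) for 0 <= T < 1."""
    perceived = -math.log(1 - T)  # integral of 1/(1 - t) from 0 to T
    return T, perceived

# Perceived time grows without bound as clock time approaches the
# singularity at t = 1, even though clock time stays finite.
for T in (0.9, 0.99, 0.999999):
    clock, perceived = clock_and_perceptual_time(T)
    print(f"clock time {clock}: perceived time {perceived:.2f}")
```

So from the inside, the singularity is never reached, which is the sense
in which nothing happens "after" it.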

On Tue, Mar 9, 2021, 1:25 AM Ben Goertzel <[email protected]> wrote:

> > So let's try it. If I randomly connect my neurons to the neurons in a
> dog's brain, then I get a lot of novel sensations that just confuse me.
> After years of experiments I learn their meanings. When I taste metal, it
> means the dog is scratching its left ear, and so on.
> >
> > Eventually our minds work as one. It's as if I have two bodies, one
> human and one dog. It doesn't tell me if the dog is conscious because it
> feels like there is only one consciousness connected to both bodies.
>
> But I suspect the interesting part occurs between the above two
> paragraphs.  In the state where it's quite as if a separate system is
> feeding you sensations, yet not quite as if there is a single mind
> spanning the human and dog body.   There will be an intermediate state
> where you sense the dog's consciousness subjectively and
> experientially, in the vein of what Martin Buber called an "I-Thou"
> experience.
>
> And this state will not be remotely so intense an I-Thou experience if
> the dog is replaced with a toaster...
>
> I don't expect you to believe this will happen, given your current
> state of understanding.   But I do expect that if you survive the
> Singularity, you'll look back at some point and remember this chat and
> experience a nanosecond of mild amusement that silly Ben was right
> about this ;)
>

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Ta5ed5d0d0e4de96d-M3946dead179b21641762508e
Delivery options: https://agi.topicbox.com/groups/agi/subscription