On 09/09/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:

> > > Your dilemma: after you upload, does the original human then become a
> > > p-zombie, or are there two copies of your consciousness? Is it
> > > necessary to kill the human body for your consciousness to transfer?
> >
> > I have the same problem in ordinary life, since the matter in my brain
> > from a year ago has almost all dispersed into the biosphere. Even the
> > configuration of matter in my current brain, and the information it
> > represents, only approximates that of my erstwhile self. It's just
> > convenient that my past selves naturally disintegrate, so that I don't
> > encounter them and fight it out to see which is the "real" me. We've
> > all been through the equivalent of destructive uploading.
>
> So your answer is yes?

No, it is not necessary to destroy the original. If you do destroy the
original, you have a 100% chance of ending up as the copy, while if you
don't, you have a 50% chance of ending up as the copy. It's like the
way probability works if the many-worlds interpretation (MWI) of
quantum mechanics is correct.
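
To make the arithmetic concrete, here is a minimal sketch (entirely my
own toy model, assuming each surviving copy counts as an equally
weighted subjective continuation):

import random

def upload(destructive):
    # List the continuations that exist after the upload, then sample
    # which one "I" subjectively become, weighting each equally.
    continuations = ["copy"] if destructive else ["original", "copy"]
    return random.choice(continuations)

for destructive in (True, False):
    outcomes = [upload(destructive) for _ in range(10000)]
    print(destructive, outcomes.count("copy") / len(outcomes))
# destructive=True  -> you end up as the copy with probability 1.0
# destructive=False -> with probability ~0.5, as under MWI branching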

> > There is no guarantee that something which behaves the same way as the
> > original also has the same consciousness. However, there are good
> > arguments in support of the thesis that something which behaves the
> > same way as the original as a result of identical or isomorphic brain
> > structure also has the same consciousness as the original.
>
> So if your brain is a Turing machine in language L1 and the program is
> recompiled to run in language L2, then the consciousness transfers?  But if
> the two machines implement the same function but the process of writing the
> second program is not specified, then the consciousness does not transfer
> because it is undecidable in general to determine if two programs are
> equivalent?

It depends on what you mean by "implements the same function". A
black box that emulates the behaviour of a neuron and can be used to
replace neurons one by one, as Hans Moravec proposed, will result in
no alteration to consciousness (as argued in David Chalmers' "fading
qualia" paper: http://consc.net/papers/qualia.html), so total
replacement by these black boxes will result in no change to
consciousness. It doesn't matter what is inside the black box, as
long as it is functionally equivalent to the biological tissue.
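
As a toy illustration of "functionally equivalent" (the names and the
thresholding model below are my own invention, not Moravec's or
Chalmers'): two implementations can differ completely on the inside
while agreeing on every input, even though deciding such equivalence
for arbitrary programs is undecidable in general.

import itertools

# Two "black boxes" with different internals but identical
# input/output behaviour; only the behaviour at the interface
# matters for the neuron-replacement argument.

def neuron_biological(inputs):
    # Stand-in for the original tissue: fire if summed input > 0.5.
    return 1.0 if sum(inputs) > 0.5 else 0.0

def neuron_silicon(inputs):
    # Different internals (accumulation over a sorted copy), same
    # outward behaviour on every input.
    total = 0.0
    for x in sorted(inputs):
        total += x
    return 1.0 if total > 0.5 else 0.0

# They agree on every case we test, but proving equivalence for
# arbitrary programs is undecidable in general (Rice's theorem).
for inputs in itertools.product([0.0, 0.2, 0.4], repeat=3):
    assert neuron_biological(list(inputs)) == neuron_silicon(list(inputs))

On the other hand...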

> On the other hand, your sloppily constructed customized AGI will insist
> that it is a conscious continuation of your life, even if 90% of its
> memories are missing or wrong. As long as the original is dead, nobody
> else will notice the difference, and others seeing your example will
> have happily discovered the path to immortality.

That could just as well be an actor taking my place. Admittedly it
might be difficult to tell us apart, but resemblance is no guarantee
of survival.

> Arguments based on the assumption that consciousness exists always lead to
> absurdities.  But belief in consciousness is instinctive and universal.  It
> cannot be helped.  The best I can do is accept both points of view, realize
> they are inconsistent, and leave it at that.

What is the difference between really being conscious and only
thinking that I am conscious?


-- 
Stathis Papaioannou
