--- Stathis Papaioannou <[EMAIL PROTECTED]> wrote:

> On 09/09/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> 
> > > > Your dilemma: after you upload, does the original human then become a
> > > > p-zombie, or are there two copies of your consciousness?  Is it
> > > > necessary to kill the human body for your consciousness to transfer?
> > >
> > > I have the same problem in ordinary life, since the matter in my brain
> > > from a year ago has almost all dispersed into the biosphere. Even the
> > > configuration [of] matter in my current brain, and the information it
> > > represents, only approximates that of my erstwhile self. It's just
> > > convenient that my past selves naturally disintegrate, so that I don't
> > > encounter them and fight it out to see which is the "real" me. We've
> > > all been through the equivalent of destructive uploading.
> >
> > So your answer is yes?
> 
> No, it is not necessary to destroy the original. If you do destroy the
> original you have a 100% chance of ending up as the copy, while if you
> don't you have a 50% chance of ending up as the copy. It's analogous to
> how probability works under the many-worlds interpretation (MWI) of QM,
> if that interpretation is correct.

No, you are thinking in the present, where there can be only one copy of a
brain.  When technology for uploading exists, you have a 100% chance of
becoming the original and a 100% chance of becoming the copy.


> >
> > So if your brain is a Turing machine in language L1 and the program is
> > recompiled to run in language L2, then the consciousness transfers?  But
> > if the two machines implement the same function but the process of
> > writing the second program is not specified, then the consciousness does
> > not transfer because it is undecidable in general to determine if two
> > programs are equivalent?
> 
> It depends on what you mean by "implements the same function". A black
> box that emulates the behaviour of a neuron and can be used to replace
> neurons one by one, as per Hans Moravec, will result in no alteration
> to consciousness (as shown in David Chalmers' "fading qualia" paper:
> http://consc.net/papers/qualia.html), so total replacement by these
> black boxes will result in no change to consciousness. It doesn't
> matter what is inside the black box, as long as it is functionally
> equivalent to the biological tissue. On the other hand...

I mean "implements the same function" in that identical inputs result in
identical outputs.  I don't insist on a 1-1 mapping of machine states as
Chalmers does.  I doubt it makes a difference, though.
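
To pin down what "implements the same function" means here, a toy sketch in
Python (my own illustration, obviously not a model of a brain): two programs
that return identical outputs for identical inputs while sharing no
intermediate machine states.  Testing gives evidence of equivalence, but, as
noted above, deciding equivalence of arbitrary programs is undecidable in
general.

def sum_loop(n):
    # accumulate 0 + 1 + ... + n one step at a time
    total = 0
    for i in range(n + 1):
        total += i
    return total

def sum_formula(n):
    # closed form n*(n+1)/2; no intermediate states in common with sum_loop
    return n * (n + 1) // 2

# Identical inputs give identical outputs on every case we try, with no
# 1-1 mapping between the two programs' machine states.
assert all(sum_loop(n) == sum_formula(n) for n in range(1000))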

Also, Chalmers argues that a machine copy of your brain must be conscious.
But he has the same instinct to believe in consciousness as everyone else.  My
claim is broader: either a machine can be conscious, or consciousness does not
exist.

> What is the difference between really being conscious and only
> thinking that I am conscious?

Nothing.


-- Matt Mahoney, [EMAIL PROTECTED]

