I would agree it is just a process; however, I came up with that  
thought experiment because a friend of mine expressed fear over the  
idea of stepping into a teleporter.  He was wary that his  
consciousness would be destroyed and that the copy created elsewhere  
would "not be me".

I began by telling him it doesn't matter so long as a copy of him  
existed somewhere that thought it was him, which is a lot like your  
"it's a process" argument, but he was not satisfied.  I think my  
argument helps show that even if consciousness is a process, it  
subjectively continues even if in some place or time it is stopped.  
Therefore he ought not fear a Star Trek-style transporter.
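The "pause, snapshot, transfer, resume" teleportation in Jason's thought experiment below can be sketched in a few lines of Python.  This is a toy illustration only: it assumes the entire simulation state can be captured as a byte string, and `snapshot`, `transfer`, and `step` are hypothetical names for this sketch, not any real brain-simulation API.  The point it makes concrete is the no-op swap: two deterministic copies stepped on identical input stay bit-for-bit identical, so exchanging their states changes nothing.

```python
import hashlib

def snapshot(state: bytes) -> bytes:
    """Pause the simulation and capture its complete state as bytes."""
    return bytes(state)

def transfer(src_state: bytes) -> bytes:
    """'Teleport' a state: the receiver overwrites its save with src_state."""
    return bytes(src_state)

# The Brussels scan is loaded onto the Washington and Moscow computers.
brussels = b"complete brain state at the substitution level"
washington = snapshot(brussels)
moscow = snapshot(brussels)

# Both run the same deterministic step on identical input; sha256 here
# stands in for X instructions of simulation.  The states stay identical.
step = lambda s: hashlib.sha256(s).digest()
washington, moscow = step(washington), step(moscow)
assert washington == moscow

# Swapping the two states is a no-op: each computer receives exactly
# the bytes it already holds.
washington, moscow = transfer(moscow), transfer(washington)
assert washington == moscow
```

Since the two saved states are indistinguishable at the byte level, nothing in the machines records which "copy" is which; that is the intuition behind calling them two implementations of one mind.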


On Nov 1, 2008, at 4:49 PM, Brent Meeker <[EMAIL PROTECTED]> wrote:

> If you stop thinking of consciousness as a "thing" that goes here or
> there or is duplicated or destroyed and just regard it as a process,
> these conundrums disappear.
> Brent
> Jason Resch wrote:
>> I've thought of an interesting modification to the original UDA
>> argument which would suggest that one's consciousness is at both
>> locations simultaneously.
>> Since the UDA accepts digital mechanism as its first premise, then it
>> is possible to instantiate a consciousness within a computer.
>> Therefore, instead of a physical teleportation from Brussels to
>> Washington and Moscow, we will have a digital transfer.  This
>> will allow the experimenter to have complete control over the input
>> each mind receives and guarantee identical content of experience.
>> A volunteer in Brussels has her brain frozen and scanned at the
>> necessary substitution level and the results are loaded into a
>> computer with the appropriate simulation software that can accurately
>> model her brain's functions, therefore from her perspective, her
>> consciousness continues onward from the time her brain was frozen.
>> To implement the teleportation, the simulation in the computer in
>> Brussels is paused, and a snapshot of the current state is sent over
>> the Internet to two computers, one in Washington and the other in
>> Moscow.  Each of these computers has the same simulation software
>> and, upon receipt, resumes the simulation of the brain where it left
>> off in Brussels.
>> The question is: if the sensory input is pre-fabricated and identical
>> in both computers, are there two minds, or simply two implementations
>> of the same mind?  If you believe there are two minds, consider the
>> following additional steps.
>> Since it was established that the experimenter can "teleport" minds by
>> pausing a simulation, sending their content over the network, and
>> resuming it elsewhere, then what happens if the experimenter wants to
>> teleport the Washington mind to Moscow, and the Moscow mind to
>> Washington?  Assume that both computers were preset to run the
>> simulation for X number of CPU instructions before pausing the
>> simulation and transferring the state, such that the states are
>> exactly the same when each is sent.  Further assume that the
>> hard-drive space on the computers is limited, so as they receive the
>> brain state, they overwrite their original save.
>> During this procedure, the computers in Washington and Moscow each
>> receive the other's brain state, however, it is exactly the same as
>> the one they already had.  Therefore the overwriting is a no-op.
>> After the transfer is complete, each computer resumes the simulation.
>> Now is Moscow's mind on the Washington computer?  If so how did a
>> no-op (overwriting the file with the same bits) accomplish the
>> teleportation, if not, what makes the teleportation fail?
>> What happens in the case where the Washington and Moscow computers
>> shut down for some period of time (5 minutes, for example) and then
>> the Moscow computer is turned back on?  Did a "virtual" teleportation
>> occur between Washington and Moscow to allow the consciousness that
>> was in Washington to continue?  If not, then would a physical transfer
>> of the data from Washington to Moscow have saved its consciousness,
>> and if so, what happened to the Moscow consciousness?
>> The above thought experiments led me to conclude that both computers
>> implement the same mind and are the same mind, despite being distinct
>> physical implementations.  Turning off one of the computers in
>> either Washington or Moscow, therefore, does not end the
>> consciousness.  Per the conclusions put forth in the UDA, the
>> volunteer in Brussels would say she has a 1/2 chance of ending up in
>> the Washington computer and 1/2 chance of ending up in the Moscow
>> computer.  Therefore, if you told her "15 minutes after the
>> teleportation the computer in Washington will be shut off forever,"
>> she should expect a 1/2 chance of dying.  This seems to be a
>> contradiction, as there is a "virtual" teleportation from Washington
>> to Moscow which saves the consciousness in Washington from oblivion.
>> So her chances of death are 0, not 1/2, which is only explainable if
>> we assume that her mind is subjectively in both places after the first
>> teleport from Brussels, and so long as a simulation of her mind exists
>> somewhere she will never die.
>> Jason
>> On Fri, Oct 31, 2008 at 12:36 PM, Bruno Marchal <[EMAIL PROTECTED]
>> <mailto:[EMAIL PROTECTED]>> wrote:
>>    On 30 Oct 2008, at 23:58, Brent Meeker wrote:
>>> Kory Heath wrote:
>>>> On Oct 30, 2008, at 10:06 AM, Bruno Marchal wrote:
>>>>> But ok, perhaps I have made some progress lately, and I will answer
>>>>> that the probability remains invariant for that too.  The
>>>>> probability remains equal to 1/2 in the imperfect duplication
>>>>> (assuming 1/2 is the perfect one).
>>>>> But of course you have to accept that if a simple teleportation is
>>>>> done imperfectly (without duplication), but without killing you, the
>>>>> probability of surviving is one (even if you end up blind, deaf,
>>>>> amnesic and paralytic, for example).
>>>> This is the position I was arguing against in my earlier post.
>>>> Let's stick with simple teleportation, without duplication.  If the
>>>> data is scrambled so much that the thing that ends up on the other
>>>> side is just a puddle of goo, then my probability of surviving the
>>>> teleportation is 0%.  It's functionally equivalent to just killing
>>>> me at the first teleporter and not sending any data over.  (Do you
>>>> agree?)
>>>> If the probability of me surviving when an imperfect copy is made
>>>> is still 100%, then there's some point of "imperfection" at which my
>>>> chances of surviving suddenly shift from 100% to 0%.  This change
>>>> will be marked by (say) the difference of a single molecule (or bit
>>>> of data, or whatever).  I don't see how that can be correct.
>>>> -- Kory
>>> But there are many ways for what comes out of the teleporter to
>>> *not* be you.
>>> Most of them are "puddles of goo", but some of them are copies of
>>> Bruno or
>>> imperfect copies of me or people who never existed before.
>>> Suppose it's a copy of you as you were last year - is it 100% you?
>>> It's not
>>> 100% the you that went into the machine - but if you're the same
>>> person you were
>>> last year it's 100% you.  Of course the point is that you're not the
>>> same "you"
>>> from moment to moment in the sense of strict identity of information
>>> down to the
>>> molecular level, or even the neuron level.
>>    Yes. And if a teleporter transforms me into a copy of me as I was
>>    last year, I will say that although I have survived 100%, I suffer
>>    from an amnesia covering one year of experience, and indeed I will
>>    have to relearn what "I" have done and update myself accordingly.
>>    I can complain about the doctor or about the teleportation
>>    company, of course, like someone who survived a train accident,
>>    with injuries and perhaps amnesia, and can complain about the
>>    railroad company (if he remembers the name).
>>    --Bruno Marchal
>>    http://iridia.ulb.ac.be/~marchal/

You received this message because you are subscribed to the Google Groups 
"Everything List" group.