> On 6 Oct 2015, at 10:49 AM, Bruce Kellett <bhkell...@optusnet.com.au> wrote:
>
>> On 6/10/2015 9:54 am, Brent Meeker wrote:
>>> On 10/5/2015 3:05 PM, Bruce Kellett wrote:
>>> This, of course, is the heart of our disagreement. Your identity is a lot
>>> more than your thoughts and feelings, because those thoughts and feelings
>>> only have meaning in a context. And it is your physical body and immediate
>>> surrounds that provide that context. You might be right about uploading if
>>> the uploading is into an environment that is not too dissimilar from the
>>> current context of your body. However, if your sensory inputs change to any
>>> marked extent, you will certainly be aware that something has happened. If
>>> the changes are too drastic and inexplicable, you will quite probably go
>>> mad.
>>>
>>> I agree that the hardware of your brain, per se, is not important -- but only
>>> if one form of hardware simulates the other, essentially exactly, and if
>>> the wider environment is largely reproduced. Consciousness supervenes on
>>> the physical brain, and replacing the hardware does not alter this fact:
>>> your consciousness still supervenes on the physical substrate. It is not
>>> independent of it, as you wish to maintain.
>>
>> It's not clear to me who is arguing for what. Stathis may think that
>> consciousness is independent of its physical substrate, but I don't see
>> that he's arguing that here. He's arguing that there can be more than one
>> instance of "the same" consciousness. But it's not clear what is
>> meant by "the same". Does one think of one's own consciousness as being the
>> same as it was a second ago? An hour? A year? Twenty years? I think
>> there must be degrees of "sameness". Similarly, the degree will depend on
>> the environmental context and interaction. If you became completely
>> immobilized, I think it would change your consciousness. Stephen Hawking is
>> quite different from what he was 50 years ago.
>> If you had a chip implanted that
>> allowed you to perceive the whole EM spectrum, including polarization, it might
>> well change your consciousness. Drugs and accidents change people's
>> personality and so, by inference, their consciousness. So does just plain
>> learning.
>
> I agree. The argument has become a little unclear. As I understand Stathis's
> position, he is arguing that since consciousness is a computation, any
> physical instantiation of that computation leads to the same consciousness --
> the same person, in fact. My objection was really that such an idea makes no
> sense if you are considering instantiations of that computation in different
> universes, or at times and locations outside our light cone. That is for two
> reasons -- first, there is no proof that any such 'copies' of our brain
> activity actually exist; and second, even if they exist, you can never
> know that they exist, or when or where. In addition, even if they do exist, they
> can have no effect on you here and now -- they are outside the light cone,
> after all.
And you haven't explained why these things should make any subjective difference.

> Given these considerations, I would argue that since we cannot know they
> exist, and they can have no effect even if they do exist, we can simply
> ignore the possibility. It can make no difference to our understanding of
> anything, one way or the other.
>
> I think the basic problem arises from an attempt to reify Bruno's
> computational theory. Bruno's idea seems to be that our consciousness is
> essentially a particular computation, and that particular computation will
> exist many times (probably an infinite number of times) in arithmetic. So our
> particular consciousness, and the environment in which it is found, is made
> up of the statistics of the computations going through that conscious state.
> But an essential element of this is that these computations exist only in
> arithmetic -- they do not exist in a physical world -- they are not physical
> operations of a physical "computer". So the attempt to identify these many
> computations running through my consciousness with the existence of multiple
> copies of me in the level one multiverse is simply a confusion of categories
> -- a confusion of the (Platonic) arithmetic level with the (real-world)
> physical level. Not unnaturally, this confusion leads to nonsense, such as
> the idea that one's consciousness might continue in another universe if
> something goes wrong in this universe.
>
> The only way you can copy your consciousness, if that is indeed possible, is
> to gather the information and make a copy using standard physical processes.
> There is no magical "dual" fact about consciousness such that it exists
> without that substrate. Consciousness supervenes on the physical brain: if
> you have two physical brains, you have two consciousnesses.
>
> Bruce
>
>> Is the question really about "Can we achieve immortality by copying to
>> different substrates?"
>> As Bruce points out, we would only preserve our
>> "self" exactly up to the last copy event, since we would have diverged from
>> there. It's like making a backup on your computer: it doesn't mean that
>> nothing is lost when it crashes.
>>
>> Brent

-- 
You received this message because you are subscribed to the Google Groups "Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an email to everything-list+unsubscr...@googlegroups.com.
To post to this group, send email to everything-list@googlegroups.com.
Visit this group at http://groups.google.com/group/everything-list.
For more options, visit https://groups.google.com/d/optout.