With great trepidation, I will try to keep this to computing :D It may revolve around the meaning of "uploading", but my problem with the uploading approach is that it makes a copy. Whether a copy is the same as the real thing is, I feel, beyond the scope of a computing discussion in this particular sense. I assert that I am not interested in a copy of Me (in legal style, I will use capitals for defined terms).
The next thing is the definition of Me. For the purpose of this discussion, Me is defined as the pattern of interaction of physical processes that happens within the volume bounded by my skin. I will refine this to the concept of Sensory Me, which I define as the pattern of interaction of physical processes that happens within my nervous system. I will refine it further to the concept of Conscious Me, which I define as the "pattern of interaction" from the definition of Sensory Me, considered separately from the "physical processes" of the same.

With the definition of Conscious Me in place, what I am interested in is preserving the Conscious Me, whether in its original form (i.e. implemented on top of the original physical processes, that is, embodied in a human body) or over a different substrate.

Side note: if you disagree with my definitions, then please don't argue the conclusions using your own definitions. I consider it axiomatic that from different definitions we will likely arrive at something different, so there is no argument to be had, really.

It seems possible to me to replace, one by one, various physical processes with processes of a different type that would support the same pattern of interaction (Conscious Me). The distinction I am making is that I am interested in continuing the existing pattern (Conscious Me) while hot-swapping, so to speak, the physical processes implementing it. This is the best illustration of why I feel "uploading", which to me implies a copy, would be wrong and horrible: the existing pattern would be discontinued while the uploaded pattern would be permitted to endure.

More on computation... There is ample evidence, which I will sort of assume and handwave, that our Conscious Mes are capable of great flexibility and plasticity. For example, when I drive a car, my concept of "me" incorporates the machine I am operating. This effect is even more pronounced when piloting an aircraft.
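To make the hot-swapping distinction concrete, here is a toy sketch (not a claim about consciousness, just an analogy): the "pattern" is a running computation whose state and behavior persist while the units implementing it are replaced one at a time. All names here (Unit, Pattern, hot_swap) are hypothetical, invented for this illustration.

```python
class Unit:
    """One replaceable 'physical process' supporting the pattern."""
    def __init__(self, substrate):
        self.substrate = substrate  # e.g. "biological" or "silicon"

    def step(self, value):
        return value + 1  # same input/output behavior on any substrate


class Pattern:
    """The ongoing computation. Its identity is the running process
    and its accumulated state, not any particular set of units."""
    def __init__(self, units):
        self.units = units
        self.state = 0  # accumulated history: continuity lives here

    def tick(self):
        for u in self.units:
            self.state = u.step(self.state)

    def hot_swap(self, index, new_unit):
        # Replace one supporting process without stopping the pattern.
        self.units[index] = new_unit


pattern = Pattern([Unit("biological") for _ in range(3)])
pattern.tick()                      # the pattern runs: state 0 -> 3
for i in range(3):                  # gradual, one-by-one replacement
    pattern.hot_swap(i, Unit("silicon"))
pattern.tick()                      # same pattern continues: 3 -> 6
print(pattern.state)                # -> 6
```

The point of the sketch is that `pattern` is never copied and never stops; an "upload" would instead construct a second `Pattern` from a snapshot, which is exactly the copy I object to.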
Or consider our ability to train our brains to see with our tongues: <http://www.scientificamerican.com/article.cfm?id=device-lets-blind-see-with-tongues>

I am very interested in the Hierarchical Temporal Memory <https://www.numenta.com/technology.html#cla-whitepaper> (the "HTM") model of how the human neocortex computes, and a lot of my views about Conscious Me are informed by the HTM model. HTM proposes one algorithm, implemented on a certain physical architecture, that can give rise to the "Metaphors We Live By" <http://www.amazon.com/Metaphors-We-Live-By-ebook/dp/B006KYECYA/ref=tmm_kin_title_0> types of thinking that human beings seem to have.

The reason I am very interested in dynamic objects all the way down (the types of systems VPRI is building) is that I am looking at them through the lens of preserving the Conscious Me. Fully dynamic objects running on hardware seem promising in this regard. The Actor Model also helps to frame some of these things through a slightly different lens, hence my interest in it. Both seem to allow emergent behavior for processes that may in the future support Conscious Me. Admittedly, the interface between the two physical processes remains a subject for future research.

On Tue, Apr 23, 2013 at 10:11 AM, Loup Vaillant-David <l...@loup-vaillant.fr> wrote:

> On Tue, Apr 23, 2013 at 04:01:20PM +0200, Eugen Leitl wrote:
> > On Fri, Apr 19, 2013 at 02:05:07PM -0500, Tristan Slominski wrote:
> > >
> > > That alone seems to me to dismiss the concern that mind uploading would not
> > > be possible (despite that I think it's a wrong and a horrible idea
> > > personally :D)
>
> Personally, I can think of 2 objections:
>
> 1. It may turn out that mind uploading doesn't actually transfer your
>    mind in a new environment, but actually makes a *copy* of you,
>    which will behave the same, but isn't actually you. From the
>    outside, it would make virtually no difference, but from the
>    inside, you wouldn't get to live in the Matrix.
>
> 2.
>    There's those cool things called "privacy" and "free will" that
>    can get seriously compromised if anyone but a saint ever get root
>    access to the Matrix you live in. And we have plenty of reasons
>    to abuse such a system. Like:
>
>    - Boost productivity with happy slaves. Just copy your best
>      slaves, and kill the rest. Or make them work 24/7 by killing
>      them every 8 hours, and restarting a saved state. (I got the
>      idea from Robin Hanson.)
>
>      Combined with point (1), this is a killer: we will probably get
>      to a point where meatbags are not competitive enough to feed
>      themselves. So, everyone dies soon, and Earth becomes a giant
>      City of Ghosts.
>
>    - Make a number of psychological experiments by simulating a giant
>      cube of 27*27*27 rooms with lots of traps.
>
>    - Indulge your base instincts by inflicting the unspeakable to the
>      copy of your chosen victim(s). Nobody will notice anyway.
>
> I still think there's a potential for paradise there, but if we screw
> up, it could be worse than Hell.
>
> Loup.
> _______________________________________________
> fonc mailing list
> fonc@vpri.org
> http://vpri.org/mailman/listinfo/fonc