On Mon, Dec 29, 2014 at 8:17 PM, Piaget Modeler via AGI <[email protected]> 
wrote:
>
> But I would submit that a copy of you is not you.  A clone of you is not you, 
> it is a clone.
> It doesn't have the same identity.  Neither does a program that simulates 
> your mind.
> (In my humble opinion, of course.)

It is normal to feel that way. An exact copy of you would also claim
to be you. Most uploading proposals avoid this issue by killing the
original. For example, Hayworth proposes slicing up your brain,
scanning it at 5 nm resolution to reconstruct the connectome, and
using this information to program a robot. But he estimates this
technology is about 100 years away.
http://brainpreservation.org/content/killed-bad-philosophy

I believe that non-destructive uploading is closer, because it is
easier to collect the needed information just by watching and
interacting with a person for a year or so. This raises the copying
issue, of course. Nobody is going to believe that the copy is them.
The solution is to wait for the biological body to die, and then it is
no longer an issue. Even if you believe that souls exist, your copy
will believe it too. If the upload is done right, nobody will notice
any difference.

It still requires solving hard problems in AI and robotics, but we are
getting there.

-- 
-- Matt Mahoney, [email protected]

