--- Stathis Papaioannou <[EMAIL PROTECTED]> wrote:

> On 08/09/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> > I agree this is a great risk. The motivation to upload is driven by fear
> > of death and our incorrect but biologically programmed belief in
> > consciousness. The result will be the extinction of human life and its
> > replacement with godlike intelligence, possibly this century. The best we
> > can do is view this as a good thing, because the alternative -- a rational
> > approach to our own intelligence -- would result in extinction with no
> > replacement.
>
> If my upload is deluded about its consciousness in exactly the same
> way you claim I am deluded about my consciousness, that's good enough
> for me.
And it will be, if the copy is exact.

Your dilemma: after you upload, does the original human then become a p-zombie, or are there two copies of your consciousness? Is it necessary to kill the human body for your consciousness to transfer? What if the copy is not exact, but close enough to fool others who know you?

Maybe you won't have a choice. Suppose you die before we have developed the technology to scan neurons, so family members customize an AGI in your likeness based on all of your writing, photos, and interviews with people who knew you. All it takes is 10^9 bits of information about you to pass a Turing test. As we move into the age of surveillance, this will get easier to do. I bet Yahoo knows an awful lot about me from the thousands of emails I have sent through their servers.

-- Matt Mahoney, [EMAIL PROTECTED]
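
A rough back-of-envelope for the 10^9-bit figure above, using assumed inputs (emails per year, characters per email, years of archive, and bits per character after compression are illustrative guesses, not measurements):

    # Back-of-envelope: how much information about a person an email archive
    # holds, compared to ~10^9 bits. All input values below are assumptions.
    emails_per_year = 2000   # assumed volume of personal email
    chars_per_email = 2000   # assumed average body length
    bits_per_char = 1.0      # roughly what good text compressors achieve on English
    years = 20               # assumed lifetime of the archive

    archive_bits = emails_per_year * chars_per_email * bits_per_char * years
    print(f"archive: ~{archive_bits:.1e} bits")            # ~8.0e+07 bits
    print(f"fraction of 10^9: {archive_bits / 1e9:.0%}")   # ~8%

Under these guesses, a long-running email archive alone gets within an order of magnitude or two of 10^9 bits, which is the sense in which pervasive surveillance makes such a reconstruction easier.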
