Would it be Friendly to turn you into computronium if your memories were
preserved and the newfound computational power was used to make you immortal in a simulated world of your choosing: for example, one without suffering, or one where you had a magic genie, superpowers, or enhanced intelligence, or
maybe a world indistinguishable from the one you are in now?

That's easy. It would *NOT* be Friendly if I have a goal of not being turned into computronium even if <your clause> (and I hereby state that I do have such a goal).

Uplifting a dog, if it results in a happier dog, is probably Friendly, because the dog has no explicit or derivable goal not to be uplifted.

BUT - uplifting a human who emphatically wishes not to be uplifted is absolutely Unfriendly.

-------------------------------------------
agi
Archives: http://www.listbox.com/member/archive/303/=now
RSS Feed: http://www.listbox.com/member/archive/rss/303/
