Matt,

You're missing the point. Your questions ask whether certain conditions are equivalent to my statements; they are interpretations, not challenges to the statements themselves.

----- Original Message ----- From: "Matt Mahoney" <[EMAIL PROTECTED]>
To: <[email protected]>
Sent: Tuesday, October 02, 2007 7:12 PM
Subject: Re: [agi] Religion-free technical content


--- Mark Waser <[EMAIL PROTECTED]> wrote:

> Do you really think you can show an example of a true moral universal?

Thou shalt not destroy the universe.
Thou shalt not kill every living and/or sentient being including yourself.
Thou shalt not kill every living and/or sentient being except yourself.

Suppose that the human race were destructively uploaded. Is this the same as
killing the human race?  What if the technology were not perfect and only X
percent of your memories could be preserved?  What value of X is the threshold
between killing and saving the human race?

Would setting the birth rate to zero be the same as killing every living
thing?

Would turning the Earth into computronium be the same as destroying it? What
about global warming?  What physical changes count as "destroying" it?

Who makes these decisions, you or the more intelligent AGI? Suppose a godlike AGI decides that the universe is a simulation. Therefore there is no need to
preserve your memory by uploading because the simulation can always be run
again to recover your memories.  Do you go along?



-- Matt Mahoney, [EMAIL PROTECTED]

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?&;


