Shane Legg wrote:
http://www.youtube.com/watch?v=WGoi1MSGu64
Which got me thinking. It seems reasonable to think that killing a
human is worse than killing a mouse because a human is more
intelligent/complex/conscious/...etc... (use whatever measure you
prefer) than a mouse.
So, would killing a superintelligent machine (assuming it was possible)
be worse than killing a human?
If a machine were more intelligent/complex/conscious/...etc... than
all of humanity combined, would killing it be worse than killing all of
humanity?
That would depend on your values, obviously. Unless there is some
objective value hierarchy, the question has no objective answer. From
some more cosmic and perhaps dissociated perspective, I would consider
the demise of a vastly greater intelligence worse than the demise of a
more numerous but far less intelligent group of beings. From the
perspective of being one of those lesser beings, I hope I would fight
like hell to preserve myself and my fellow beings, even if doing so
might doom a far greater intelligence. Hmm. Why would I hope that? What
moral program is running?
- samantha