On 03/06/2008 02:18 PM, Mark Waser wrote:
>> I wonder if this is a substantive difference with Eliezer's position
>> though, since one might argue that 'humanity' means 'the
>> [sufficiently intelligent and sufficiently ...] thinking being'
>> rather than 'homo sapiens sapiens', and the former would of course
>> include SAIs and intelligent alien beings.
> Eli is quite clear that AGIs must act in a Friendly fashion but we
> can't expect humans to do so. To me, this is foolish, since the
> attractor you can create if humans are Friendly tremendously increases
> our survival probability.
The point I was making was not so much about who is obligated to act
Friendly but about whose CEV is taken into account. You are saying all
sufficiently ... beings, while Eliezer says humanity. But does Eliezer
say 'humanity' because humanity is *us* and we care about the CEV of
our species (and its sub-species and descendants...), or 'humanity'
because we are the only sufficiently ... beings that we are presently
aware of (in which case humanity would include any other sufficiently
... being that we eventually discover)?
It just occurred to me, though, that it doesn't really matter whether it
is the CEV of homo sapiens sapiens, the CEV of some alien race, or that
of AIs, since you are arguing that they are all the same: beyond a
certain point, there is nowhere to go except towards the attractor.
joseph