Matt Mahoney wrote:
> --- Richard Loosemore <[EMAIL PROTECTED]> wrote:
>> Friendliness, briefly, is a situation in which the motivations of the
>> AGI are locked into a state of empathy with the human race as a whole.
>
> Which is fine as long as there is a sharp line dividing human from
> non-human. When that line goes away, the millions of soft constraints
> (which both Richard's and my design provide for) will no longer give
> an answer.
This is not an argument I have seen before.
It is also not coherent in the context of the proposal I have made on
this subject, for the following reason.
Once built, the AGIs would freeze the meaning of "human empathy" in such
a way that there could be no significant departure from that standard.
By definition, then, that dividing line would make no difference
whatsoever.
Richard Loosemore
-------------------------------------------
agi
Archives: http://www.listbox.com/member/archive/303/=now
RSS Feed: http://www.listbox.com/member/archive/rss/303/