> 3.  A statement in their own words that they hereby disavow allegiance
> to any non-human god or alien entity, and that they will NOT follow the
> directives of any government led by people who would obviously fail this
> test. This statement would be included on the license.

Hmmm... don't I fail this test every time I follow the speed limit? ;-)

As another aside, it seems wrong to accuse Buddhists of condoning violence
because they don't like MAD (which involves stockpiling nukes)... you could
accuse them of foolishness perhaps (though I don't necessarily agree), but
not of condoning violence.

My feeling is that with a group as intelligent and individualistic as
transhumanists and AI researchers, any "litmus test for cognitive sanity"
you come up with is gonna be quickly revealed to be full of loopholes that
lead to endless philosophical discussions... so in the end, such a test
could only be used as a general guide, with the ultimate
cognitive-sanity judgment made on a qualitative basis.

In a small project like Novamente, we can evaluate each participant
individually to assess their thought process and background.  In a larger
project like OpenCog, there is not much control over who gets involved, and
making people sign a form promising to be rational and cognitively sane
wouldn't seem to help much, as obviously there is nothing forcing people to
be honest...

ben g



-------------------------------------------
agi
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=8660244&id_secret=120640061-aded06
Powered by Listbox: http://www.listbox.com