On 11/7/07, Edward W. Porter <[EMAIL PROTECTED]> wrote:
> There is no reason why properly designed AGIs with world knowledge and the
> power to compute from it would have any less common sense than humans.

Ok.  Are you also going to cripple your AGI's ability to think
rationally enough that it is fully comparable in skill to a human?
With super-human skill at "common sense" and equally superhuman
rationality, will this AGI be considered mentally healthy if the
observing psychologist is not augmented to extra-super-human
reasoning?

-----
This list is sponsored by AGIRI: http://www.agiri.org/email