Mark Waser wrote:
So the real question is "what is the minimal amount of
intelligence needed for a system to self-engineer
improvements to itself?"

Some folks might argue that humans are just below that
threshold.

Humans are only below the threshold because our internal systems are so convoluted and difficult to change.

And because we lack the cultural knowledge of a theory of intelligence, though we are probably quite capable of comprehending one.

--
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
