Agreed, but the Turing test is annoying from the perspective of a system
like Novamente that explicitly does NOT try to be humanlike...

I don't know whether the Turing test or the University of Phoenix test
will wind up being harder for a semi-mature Novamente system...

ben

On 4/25/07, J. Storrs Hall, PhD. <[EMAIL PROTECTED]> wrote:

On Tuesday 24 April 2007 18:06, Eliezer S. Yudkowsky wrote:

> Aside from that, it [the U. of Phoenix test] sounds fair enough to me,
> and unlike the Turing Test it might not require strongly superhuman
> intelligence.

The Turing Test doesn't require superhuman intelligence, strong or otherwise.
I could pass it, for example. It would require abilities beyond what could be
called legitimately intelligent, given that the intelligence was implemented
in such a way as to be optimized to the capabilities of a serial von Neumann
computer instead of an evolved human brain, since it would have to hide some
of its abilities and pretend feelings and so forth.

On the other hand, an upload would pass the Turing test effortlessly, and
would be about as close to a definition of an exactly human-equivalent AI as
you're going to get. An AI that was the product of a Kurzweil-style
development effort, where the functional structure of the brain was copied in
general but not that of one specific individual, could pass it as well without
any more effort than an actor or spy uses in assuming a fictitious identity.

Josh

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?&;

