Howdy Shane,

I'll try to put my views in your format...

I think that

Extremely powerful, vastly superhuman AGI ==> outstanding Hutter test result

whereas

Human-level AGI =/=> Good Hutter test result

just as

Human =/=> Good Hutter test result

and for this reason I consider the Hutter test a deeply flawed
benchmark to apply to systems on the path of gradually improving,
vaguely humanlike general intelligence.

Next, I think that appropriately-constructed narrow-AI systems will be
able to outperform nearly all human-level AGI systems on the Hutter
test.

I.e., in your format, this means I feel that

Good (e.g. winning) Hutter test result =/=> powerful AGI
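
To make this concrete, here is a minimal sketch (in Python) of how a
Hutter-test-style score might be computed, assuming the usual Hutter
Prize convention that the score is the compressed size of the corpus
plus the size of the decompressor. zlib stands in for a serious narrow
compressor like PAQ, and the file names ("enwik8", "decomp.py") are
illustrative placeholders:

import os
import zlib

def hutter_score(corpus_path, decompressor_path):
    # Read the benchmark corpus (e.g. enwik8, the 10^8-byte
    # Wikipedia excerpt used by the Hutter Prize).
    with open(corpus_path, "rb") as f:
        data = f.read()
    # zlib at maximum compression stands in for a narrow compressor.
    compressed = zlib.compress(data, 9)
    # Assumed Hutter Prize convention: the total counts the
    # decompressor itself, so model size can't be hidden there.
    return len(compressed) + os.path.getsize(decompressor_path)

print(hutter_score("enwik8", "decomp.py"))

The point being that a purely statistical coder drives this number down
without any grasp of what the text means, which is exactly the sense in
which a good score fails to imply powerful AGI.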

However, I am uncertain whether

Amazingly outstanding Hutter test result ==> powerful AGI

Is that clear?

-- Ben

On 8/12/06, Shane Legg <[EMAIL PROTECTED]> wrote:

Ben,

So you think that,

Powerful AGI ==> good Hutter test result

But you have a problem with the reverse implication,

good Hutter test result =/=> Powerful AGI

Is this correct?

Shane
