If it's all so predictable, why don't you keep that to yourselves?

On 11/6/07, Eliezer S. Yudkowsky <[EMAIL PROTECTED]> wrote:
>
> Monika Krishan wrote:
> >
> > 2. Would it be a worthwhile exercise to explore what Human General
> > Intelligence, in its present state, is capable of?
>
> Nah.
>
> --
> Eliezer S. Yudkowsky                          http://singinst.org/
> Research Fellow, Singularity Institute for Artificial Intelligence
>
>

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?member_id=8660244&id_secret=61611243-abeffa