Matt,

What a search engine lacks, to have human-level AGI, is not
human erroneousness, but rather human inventiveness...

A search engine is not going to learn to play a newly invented
game, nor make up a new game ... nor prove a theorem, nor
discover a new kind of astronomical object ... nor, say, figure
out a new sex move that is only effective in really humid
climates ;-)

In short, search is a narrow domain, so that efficiency in search
is not really human-generality AGI ... nor anywhere near...

-- Ben G


On 6/2/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:


--- Benjamin Goertzel <[EMAIL PROTECTED]> wrote:

> I recently visited Google Research and gave a talk there --
> and tried my best to estimate the odds that Google has
> a secret AGI project going on ;-)
>
> My best current guess is that they do not ... and my reasons
> why and some associated thoughts may be found at
>
> http://www.goertzel.org/blog/blog.htm
>
> -- Ben G

I agree.  There is no economic incentive to duplicate all the human
weaknesses needed to pass the Turing test, things like limited memory,
slow response time, spelling and grammar mistakes, ignorance, dishonesty,
cultural biases, etc.  But there is a big incentive to duplicate human
strengths, so when you search for "funny videos", the engine should not
have to rely solely on the opinions of others.

Or is this also AGI?


-- Matt Mahoney, [EMAIL PROTECTED]

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?member_id=231415&user_secret=e9e40a7e

