If the test is defined to refer ONLY to conversations about
a sufficiently narrow domain of objects in
a toy virtual world ... and they encode enough knowledge ... then maybe they
could brute-force their way past the test ... after all, there is not that
much to say about
a desk, a table, a lamp and a box ... or whatever the set of objects in the toy
world may be...

This is the danger of toy test environments, be they in virtual worlds or
physical robotics...

ben g

On Thu, Mar 13, 2008 at 12:35 PM, Ben Goertzel <[EMAIL PROTECTED]> wrote:
> Unless the details of that modified Turing Test are somehow profoundly
>  flawed, then, yes...
>
>  ben
>
>
>
>  On Thu, Mar 13, 2008 at 12:28 PM, Eric B. Ramsay <[EMAIL PROTECTED]> wrote:
>  > So Ben, based on what you are saying, you fully expect them to fail their
>  > Turing test?
>  >
>  > Eric B. Ramsay
>  >
>  >
>  > Ben Goertzel <[EMAIL PROTECTED]> wrote:
>  >  I know Selmer and his group pretty well...
>  >
>  > It is well done stuff, but it is purely hard-coded-knowledge-based
>  > logical inference --
>  > there is no real learning there...
>  >
>  > It's not so hard to get impressive-looking functionality in toy demo
>  > tasks, by hard-coding rules and using a decent logic engine
>  >
>  > Others have failed at this, so his achievement is worthwhile and means
>  > his logic engine and formalism are better than most ... but still ...
>  > IMO, this is not a very likely path to AGI ...
>  >
>  > -- Ben
>  >
>  > On Thu, Mar 13, 2008 at 10:30 AM, Ed Porter wrote:
>  > > Here is an article about RPI's attempt to pass a slightly modified
>  > > version of the Turing test using supercomputers to power their
>  > > "Rascals" AI algorithm.
>  > >
>  > > http://www.eetimes.com/news/latest/showArticle.jhtml?articleID=206903246&printable=true
>  > >
>  > > The one thing I didn't understand was that they said their "Rascals" AI
>  > > algorithm used a theorem-proving architecture. I would assume that that
>  > > would mean it was based on binary logic, and thus would not be
>  > > sufficiently flexible to model many human thought processes, which are
>  > > almost certainly more neural net-like, and thus much more probabilistic.
>  > >
>  > > Does anybody have any opinions on that?
>  > >
>  > > Ed Porter
>  > >
>  > > -------------------------------------------
>  > > agi
>  > > Archives: http://www.listbox.com/member/archive/303/=now
>  > > RSS Feed: http://www.listbox.com/member/archive/rss/303/
>  > > Modify Your Subscription: http://www.listbox.com/member/?&;
>  >
>  > > Powered by Listbox: http://www.listbox.com
>  > >
>  >
>  >
>  >
>  > --
>  > Ben Goertzel, PhD
>  > CEO, Novamente LLC and Biomind LLC
>  > Director of Research, SIAI
>  > [EMAIL PROTECTED]
>  >
>  > "If men cease to believe that they will one day become gods then they
>  > will surely become worms."
>  > -- Henry Miller
>  >
>  >
>
>
>
>




