YKY,

 > > There's nothing wrong with the "logical" argument.  What's wrong is that
 > > you are presuming a purely declarative logic approach can work...which it
 > > can in extremely simple situations, where you can specify all necessary
 > > facts.
 > >
 > > My belief about this is that the proper solution is to have a model of
 > > the world, and how interactions happen in it separate from the logical
 > > statements.  The logical statements are then seen as focusing techniques.
 > > [ ... ]
>
> The key word here is "model".  If you can reason with mental models,
> then of course you can resolve a lot of paradoxes in logic.  This
> boils down to:  how can you represent mental models?  And they seem to
> boil down further to logical statements themselves.  In other words,
> we can use logic to represent "rich" mental models.


Unfortunately, this doesn't work either in many/most real-world cases,
though it does work in more of the simplistic cases. About half of
real-world sentences are unparsable using POS-based methods, and even
applying NI (real, genuine Natural Intelligence) to parse them leads to
wildly inaccurate semantics that are somehow understood by the listener
despite the contained inaccuracies.

Perhaps others here remember ancient history better than I do, but didn't
Roger Schank run to the end of YKY's road and publish a couple of books on
the subject, including the code to analyze and "understand" news-feed
articles? As I recall, his goal was to understand as much as possible,
not by achieving a high percentage of accuracy, but by maximizing the
total amount processed and discarding everything that confused his
parser.

I commented a while back that I had some of my own lectures transcribed, and
was horrified by some of the statements that I had made. However, apparently
everyone got what I meant to say, rather than what I actually said. This is
because statements are made in a context to communicate specific facts, so
the massive potential for erroneous implications is simply not considered by
us mere non-electronic mortals.

In short, it appears to me that YKY's goal is unachievable in the real world
by any imaginable technology short of a full-blown AGI.

Steve Richfield
