Arthur Murray wrote:
> If Ben Goertzel and the rest of the Novamente team build up
> an AI that mathematically "comprehends" mountains of data,
> they may miss the AI boat by not creating persistent concepts
> that accrete and auto-prune over time as the basis of NLP.

No, even before the Novamente system understands natural language, it will
still have persistent concepts accreting and auto-pruning over time.  In
fact, we're currently doing some preliminary testing of the accretion &
auto-pruning processes.

Accretion & auto-pruning are part of a chimp's mind-brain; they're prior to
language...
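
Just to make that concrete, here's a toy sketch in Python.  It's purely
illustrative: the class, the method names, and the exponential-decay rule
are inventions for this email, not our actual design.  The idea is simply
that concepts accrete strength with repeated observation and get
auto-pruned once they decay below a threshold:

    # Hypothetical sketch of accretion & auto-pruning; not Novamente code.
    class ConceptStore:
        def __init__(self, decay=0.9, prune_below=0.05):
            self.strength = {}             # concept -> accreted strength
            self.decay = decay             # per-tick exponential decay
            self.prune_below = prune_below # forget below this strength

        def observe(self, concept, weight=1.0):
            # Accretion: each encounter adds strength, so frequently
            # seen concepts persist.
            self.strength[concept] = self.strength.get(concept, 0.0) + weight

        def tick(self):
            # Auto-pruning: strengths decay every time step; concepts
            # falling below the threshold are forgotten outright.
            for c in list(self.strength):
                self.strength[c] *= self.decay
                if self.strength[c] < self.prune_below:
                    del self.strength[c]

    store = ConceptStore()
    for t in range(50):
        store.observe("food")             # reinforced every tick: accretes
        if t < 3:
            store.observe("novel-noise")  # seen briefly: eventually pruned
        store.tick()

    print(sorted(store.strength))         # prints ['food']

The point of the toy: persistence comes from reinforcement winning out
over decay, and nothing language-specific is involved, which is why I say
these processes are prior to language.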

> The Mentifex Mind-1.1 AI, primitive as it may be, has since 1993
> (nine years ago) gradually built up a sophisticated group of
> about 34 mind-modules now barely beginning to achieve NLP results.
>
> I enter these thoughts here not confrontationally but from a
> point-of-view that NLP is not otherwise being sufficiently
> represented among all these mathematicians and computationalists.

NLP is obviously very important.  Historically, it has often been associated
with an overly rigid "rule-based" approach to AI, which is perhaps why it's
not so fashionable among AGI people.

I agree that amenability-to-NLP should be an important consideration in any
AGI design process.  We've designed Novamente specifically so that it will
be able to learn language when the time comes; our experience with
computational linguistics at Webmind made that design possible.

On the other hand, Peter Voss has often put forth the following view (this
is a paraphrase, not a quote): "Our brains are a lot like chimps' brains.
If someone designed an AGI with chimp-level intelligence, making the
modifications to turn this chimp-AGI into a human-level AGI with linguistic
ability would be a *relatively* small trick compared to the original trick
of designing the chimp-AGI."

I think Peter has a point, but it's not the approach I'm taking.  My
approach is more linguistic than his, but less linguistic than yours.

Your approach puts linguistics at the center, it seems; but I don't think
you can FOUND an AGI system on linguistics.  For linguistic ability to be
full and genuine, involving deep semantic understanding, I think it has to
emerge mostly from more generic cognitive functionality....

-- Ben G
