On 4/28/07, J. Storrs Hall, PhD. <[EMAIL PROTECTED]> wrote:
> I disagree with this two ways. First, it's fairly well accepted among
> mainstream AI researchers that full NL competence is "AI-complete", i.e. that
> human-level intelligence is a prerequisite for NL.

I don't think that is the operational sense of NLP as pursued by
applying linguistic theories in narrow AI settings (e.g. Dynamic
Syntax, DRT, HPSG, ...).

> Secondly, even the parsing
> part of NLP is part of a more general recursive sequence
> understander/generator, which is used for doing complex tasks with the hands
> (and the conjecture is that language bootstrapped itself on this capability).

I was writing in the context of Mark Waser's language-specific
solutions (as I understand them), which, if desired, could later be
reused in broader contexts.

> In other words, although there is enough special-purpose hardware in there to
> make it make sense to call language a "module", the full capability is so
> interwoven with general cognition that it can't be separated across a
> bottleneck.
>
> Josh

We stumble here on the meaning of "capability" in this context. For
example, a general GUI library is not expected to be generally
intelligent.

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?member_id=231415&user_secret=fabd7936
