Probably off-topic - while reading about the OpenCog community's efforts in 
NLP, I am quite suspicious of statistical methods. I think that the only 
meaningful approach to NLP is combinatory categorial grammar (CCG, with 
Lambek calculus and Montague semantics): this effort tries to translate 
natural-language sentences into logical expressions, i.e. lambda-calculus 
expressions. So, since Scheme is itself a lambda-calculus language, CCGs 
offer a way of translating NL sentences directly into Scheme structures. 
Moreover, because the CCG approach is a white-box approach with an explicit 
understanding of the semantics of natural language, this semantic 
knowledge can also be encoded as Scheme/OpenCog structures and can be 
learned or enhanced over time.
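To make the idea concrete, here is a toy sketch (in Python rather than Scheme, and with a lexicon and function names that are entirely my own invention, not any OpenCog API) of how CCG application rules pair syntactic categories with lambda-calculus meanings, so that parsing a sentence directly yields a logical form:

```python
# Hypothetical sketch of CCG-style semantic composition: each lexical
# entry pairs a syntactic category with a lambda-calculus meaning, and
# the forward/backward application combinators combine them.
# (All names here are illustrative, not OpenCog's actual code.)

lexicon = {
    # word: (category, semantics)
    "John":   ("NP", "john"),
    "Mary":   ("NP", "mary"),
    "sleeps": ("S\\NP", lambda subj: f"sleeps({subj})"),
    "loves":  ("(S\\NP)/NP", lambda obj: lambda subj: f"loves({subj},{obj})"),
}

def backward_apply(arg, fn):
    """Backward application:  NP  S\\NP  =>  S."""
    cat, sem = fn
    result_cat = cat.split("\\")[0]          # S\NP -> S
    return (result_cat, sem(arg[1]))

def forward_apply(fn, arg):
    """Forward application:  (S\\NP)/NP  NP  =>  S\\NP."""
    cat, sem = fn
    result_cat = cat.rsplit("/", 1)[0].strip("()")  # (S\NP)/NP -> S\NP
    return (result_cat, sem(arg[1]))

# "John loves Mary": combine the verb with its object, then the subject.
vp = forward_apply(lexicon["loves"], lexicon["Mary"])
s = backward_apply(lexicon["John"], vp)
print(s)  # ('S', 'loves(john,mary)')
```

The point of the sketch is that the logical form falls out of ordinary function application - exactly the kind of structure that maps naturally onto Scheme expressions.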

Of course, a raw statistical approach may in the end give the same results, 
but a structured approach can be more feasible. Besides, the statistical 
approach yields all-or-nothing results, while the CCG approach yields 
results that improve step by step, and such gradually improving 
understanding reflects the human approach to the world and language: 
humans progressively learn a language, its syntax and its semantics. If we 
have the slightest doubt about the existence of a perfect understanding of 
language, then we must also have doubts about the efficiency of the 
statistical approach.
