Very interesting. I understand that OpenCog has diverse applications, but 
what I wanted to ask was whether going from subsymbolic to symbolic 
representation is one of its central goals. I am not an expert, but it 
seems to me that this step is the key to general intelligence. Am I wrong 
to say that?

On Wednesday, February 22, 2017 at 2:27:02 AM UTC+5:30, Alex wrote:
>
> There is a boom in neural network translation. Maybe it is possible to 
> extract formal grammars and symbolic NLP processors from the neural 
> networks that are trained for translation...
>
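
To make the subsymbolic-to-symbolic idea concrete, here is a minimal toy sketch (not OpenCog code, and far simpler than extracting a grammar from a translation network): a fixed-weight perceptron plays the role of the subsymbolic model, and we enumerate its behaviour to read off an equivalent symbolic rule in disjunctive normal form. All names here are illustrative.

```python
# Toy illustration: extracting a symbolic rule from a "subsymbolic" model.
# A tiny fixed-weight perceptron labels 3-bit inputs; we enumerate its
# behaviour and read off an equivalent symbolic (DNF) boolean rule.
from itertools import product

def perceptron(bits, weights=(1.0, 1.0, 1.0), threshold=1.5):
    # Subsymbolic side: a weighted sum compared against a threshold.
    return sum(w * b for w, b in zip(weights, bits)) > threshold

def extract_dnf(model, n_bits=3):
    # Symbolic side: one conjunction per accepted input, joined into a
    # disjunction (a truth-table read-off, feasible only for tiny inputs).
    terms = []
    for bits in product([0, 1], repeat=n_bits):
        if model(bits):
            term = " & ".join(f"x{i}" if b else f"~x{i}"
                              for i, b in enumerate(bits))
            terms.append(f"({term})")
    return " | ".join(terms)

rule = extract_dnf(perceptron)
print(rule)
# With these weights the perceptron is the 3-input majority function,
# so the extracted rule has one term per input with at least two 1s.
```

Of course, real work on this (e.g. extracting automata or grammars from trained RNNs) replaces the exhaustive enumeration with state clustering or active learning, since the input space is not enumerable.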

-- 
You received this message because you are subscribed to the Google Groups 
"opencog" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To post to this group, send email to [email protected].
Visit this group at https://groups.google.com/group/opencog.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/opencog/0d24d9e7-978c-4164-adfc-7abf479c352a%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.