Moshe Looks (originator of the MOSES subsystem of OpenCog) and some
colleagues at Google have a new release,

https://research.googleblog.com/2017/02/announcing-tensorflow-fold-deep.html

It's fairly subtle, but the crux as I currently understand it is that
it makes it efficient to train TensorFlow models over trees and graphs
of varying shape, rather than only over fixed-size vectors...
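To make the "models over trees" idea concrete, here's a hedged sketch (plain numpy, not the Fold API; all names are illustrative) of the kind of variable-shaped computation Fold's dynamic batching is designed to make efficient:

```python
import numpy as np

# Toy recursive tree encoder: each leaf is a word vector, and each internal
# node combines its two children through a shared weight matrix. Every input
# tree induces a differently-shaped computation graph, which is exactly the
# situation Fold's dynamic batching handles; here we just run it naively.

DIM = 4
rng = np.random.default_rng(0)
W = rng.standard_normal((2 * DIM, DIM)) * 0.1  # shared combiner weights

def encode(tree):
    """tree is either a leaf vector (np.ndarray) or a (left, right) pair."""
    if isinstance(tree, np.ndarray):
        return tree
    left, right = tree
    h = np.concatenate([encode(left), encode(right)])
    return np.tanh(h @ W)  # fixed-size embedding regardless of tree shape

leaf = lambda: rng.standard_normal(DIM)
tree = ((leaf(), leaf()), leaf())  # an unbalanced binary tree
emb = encode(tree)
print(emb.shape)
```

The point is just that every tree, whatever its shape, comes out as a fixed-size vector, which is what makes downstream clustering possible.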

This could have some uses in the OpenCog universe, e.g. if we wanted
to do clustering of little trees or sub-hypergraphs or whatever, we
could perhaps try a TreeLSTM in TensorFlow for unsupervised
classification...

As one example, this could be used to cluster words into "part of
speech" categories based on various (syntactic and semantic) data
associated with each word, in the simplest case using a TreeLSTM...
Whether this would work better than EM clustering, GP clustering, or
other approaches, I have no idea though...
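A hedged sketch of what that downstream clustering step might look like: suppose each word's context trees have already been encoded to fixed-size vectors (e.g. by a TreeLSTM trained with Fold); clustering those vectors is then standard. Here is a minimal k-means over toy stand-in embeddings (all names and data here are illustrative, not from any real pipeline):

```python
import numpy as np

rng = np.random.default_rng(1)
# stand-in for learned word embeddings: three loose groups in 8-d space
X = np.concatenate([rng.normal(c, 0.1, size=(20, 8)) for c in (-1.0, 0.0, 1.0)])

def kmeans(X, k, iters=50):
    # initialize centers at k distinct random points
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # assign each point to its nearest center
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        # recompute centers, keeping the old one if a cluster goes empty
        centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels

labels = kmeans(X, 3)
print(np.bincount(labels, minlength=3))  # cluster sizes
```

Swapping in EM (a Gaussian mixture) or some other clusterer would only change the last few lines; the open question is which works best on the actual tree embeddings.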

-- Ben



-- 
Ben Goertzel, PhD
http://goertzel.org

“I tell my students, when you go to these meetings, see what direction
everyone is headed, so you can go in the opposite direction. Don’t
polish the brass on the bandwagon.” – V. S. Ramachandran

-- 
You received this message because you are subscribed to the Google Groups 
"opencog" group.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/opencog/CACYTDBcXHNv5AAXyaKH1nJWA_s6m19MLJoJQFegjJzFRRaMM6w%40mail.gmail.com.