Hi,

That would be awesome. Thanks, Jeff Hawkins.

Matt Taylor: That is so cool. I would love to read about every new temporal pooler hypothesis.

I have a working prototype of the CLA that I built recently. It is exactly what the CLA is in the white papers and NuPIC videos, though the implementation differs a little from NuPIC. For example, the spatial pooler is called connection_matrix in this prototype: it is a matrix with random connections. Connections between cells are stored in a matrix, and learning literally strengthens those connections.

I am planning to try out the SoN idea I proposed with this CLA prototype. As far as I have hypothesized, it does form a higher-level representation: one whole sentence is represented by a few cells in an SDR [3]. Integrating motor output is also part of my plan, so instead of having different CLAs, the same ones would talk back.

I would be able to demonstrate a higher-level representation of a complete sentence, with a different representation for every word order. For example, "HOW ARE YOU" will have a different pattern than "ARE HOW YOU"; sequence is important. I am also excited to have completely understood the idea of semantic similarity through shared bits. The prototype's SDR for "WHAT ARE YOU" will share some of the bits of "HOW ARE YOU", because both contain "ARE YOU" in the same sequence.
The SoN idea I wrote up was hard for you to follow because I was unable to put it in a simple way. It is actually very simple and intuitive. Once the new version is ready, I will update my post.

Regards,
Aseem Hegshetye

_______________________________________________
nupic mailing list
[email protected]
http://lists.numenta.org/mailman/listinfo/nupic_lists.numenta.org
