Aseem,
I will be giving an informal talk on this topic at the next hackathon, but
in brief, very brief...

The CLA today has no motor component.  It is like an ear listening to sounds
but with no ability to interact with the world.  Most sensory perception is
not like that.  Most of the changes on our sensors come from our own
actions.  Imagine standing in a house.  If your eyes couldn't move and your
body couldn't move you would not be able to learn what the house is like.
You couldn't learn the patterns in the world.  Only by moving do you
discover the structure of the house.  Movement leads to sensory changes.
The brain learns sensorimotor patterns: "When I see this and turn left, I
will see that."  The same is true for touch.  Even hearing is largely
controlled by our own motions.  The only thing I am hearing right now is the
sound of the keys on my keyboard.  My cortex is predicting those sounds.  If
they changed even slightly I would notice the difference.
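To make the idea concrete, here is a toy sketch of that kind of
sensorimotor learning.  This is not the CLA or any NuPIC code, just a
minimal illustration of learning "when I sense X and do A, I will sense Y"
pairings, with the house example above:

```python
# Toy sketch (NOT the CLA/NuPIC implementation): a lookup table of
# sensorimotor transitions of the form
# "when I sense X and take action A, I will next sense Y".

class SensorimotorModel:
    def __init__(self):
        # (current sensation, action) -> predicted next sensation
        self.transitions = {}

    def learn(self, sensation, action, next_sensation):
        # Store one observed sensorimotor pairing.
        self.transitions[(sensation, action)] = next_sensation

    def predict(self, sensation, action):
        # Return the predicted next sensation, or None if this
        # pairing has never been experienced.
        return self.transitions.get((sensation, action))

# An agent "moving through a house": turning left in the hallway
# has previously revealed the kitchen, so that change is predicted.
model = SensorimotorModel()
model.learn("hallway", "turn_left", "kitchen")
print(model.predict("hallway", "turn_left"))   # a learned prediction
print(model.predict("hallway", "turn_right"))  # unlearned: no prediction
```

A real cortical model would of course use distributed representations and
sequence memory rather than a lookup table; the point is only that the
learned unit is (sensation, action) -> next sensation, not sensation alone.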

Motor behavior is how we learn most of the structure of the world.
Jeff

-----Original Message-----
From: nupic [mailto:[email protected]] On Behalf Of Aseem
Hegshetye
Sent: Friday, October 25, 2013 5:37 AM
To: [email protected]
Subject: [nupic-dev] motor implementation

Hi,
Jeff Hawkins said he is working on sensorimotor design.
How will implementation of motor layer 5 help in data prediction?
Would it be like the CLA signalling anomalies the way the cortex gives motor
commands, or are you planning on manipulating some parameters at the user
end based on the predictions from given inputs?
thanks
Aseem Hegshetye

_______________________________________________
nupic mailing list
[email protected]
http://lists.numenta.org/mailman/listinfo/nupic_lists.numenta.org

