Link-bait title aside, he's not wrong, but he misses the bigger picture.

The key is salience. What do I mean by that? A mechanism by which we can
represent what is important.

Something that causes pain has high salience, whereas a dull patch of wall
has low salience.

In a biological entity, things that hurt, feed, and arouse are innately
salient. That innateness comes from billions of years of evolution
selecting for some basic neural and motor pathways.

What's salient to a computer intelligence might be completely different, but
to make decisions, to exert attention, to learn from reinforcement and
punishment, there are some basic systems that need to be in place. The
author seems to understand this, even if dismissing the whole for the lack
of a few parts is silly.

Hence my thesis that we should try building embodied intelligence with
needs, goals, etc., so we can use the rest of what biology has taught us
when implementing these other features.
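To make the idea concrete, here's a toy sketch (entirely my own framing, not from the article): treat salience as an innate scalar per stimulus that gates how strongly an experience updates the agent's learned value. The stimuli and weights are invented for illustration.

```python
# Toy model: salience as a scalar gating the learning rate.
# Innately salient stimuli (pain, food) drive fast learning;
# a dull wall barely registers. All weights are made up.
SALIENCE = {"pain": 1.0, "food": 0.8, "dull_wall": 0.05}

def update_value(value, stimulus, reward, base_lr=0.1):
    """TD-style value update where salience scales the learning rate."""
    lr = base_lr * SALIENCE.get(stimulus, 0.1)
    return value + lr * (reward - value)

v_pain, v_wall = 0.0, 0.0
for _ in range(10):
    v_pain = update_value(v_pain, "pain", reward=-1.0)
    v_wall = update_value(v_wall, "dull_wall", reward=-1.0)
print(round(v_pain, 3), round(v_wall, 3))  # pain is learned far faster
```

The point isn't the arithmetic; it's that without some innate salience map, the agent has no principled way to decide what deserves attention or a strong update.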



On Oct 3, 2013 2:59 PM, "Matthew Taylor" <[email protected]> wrote:

>
> http://opaqueparcels.com/2013/09/30/the-brain-as-a-model-for-computers-why-jeff-hawkins-wont-lead-us-significantly-closer-intelligent-machines/
>
> Any comments? ;-)
>
> ---------
> Matt Taylor
> OS Community Flag-Bearer
> Numenta
>
> _______________________________________________
> nupic mailing list
> [email protected]
> http://lists.numenta.org/mailman/listinfo/nupic_lists.numenta.org
>
