On Wed, 21 Feb 2007, Richard Loosemore wrote:

) Aki Iskandar wrote:
) > I'd be interested in getting some feedback on the book "On Intelligence"
) > (author: Jeff Hawkins).
) > 
) > It is very well written - geared for the general masses of course - so it's
) > not written like a research paper, although it has the feel of a thesis.
) > 
) > The basic premise of the book, if I can even attempt to summarize it in two
) > statements (I wouldn't be doing it justice though) is:
) > 
) > 1 - Intelligence is the ability to make predictions on memory.
) > 2 - Artificial Intelligence will not be achieved by today's computer chips
) > and smart software.  What is needed is a new type of computer - one that is
) > physically wired differently.
) > 
) > 
) > I like the first statement.  It's very concise, while capturing a great deal
) > of meaning, and I can relate to it ... it "jibes".
) > 
) > However, (and although Hawkins backs up the statements fairly convincingly)
) > I don't like the second set of statements.  As a software architect
) > (previously at Microsoft, and currently at Charles Schwab where I am writing
) > a custom business engine, and workflow system) it scares me.   It scares me
) > because, although I have no formal training in AI / Cognitive Science, I
) > love the AI field, and am hoping that the AI puzzle is "solvable" by
) > software.
) > 
) > So - really, I'm looking for some of your gut feelings as to whether there
) > is validity in what Hawkins is saying (I'm sure there is, because there are
) > probably many ways to solve these types of challenges), but also as to
) > whether the solution(s) are going to be more hardware - or software.
) > 
) > Thanks,
) > ~Aki
) > 
) > P.S.  I remember a video I saw, where Dr. Sam Adams from IBM stated
) > "Hardware is not the issue.  We have all the hardware we need".   This makes
) > sense.  Processing power is incredible.  But after reading Hawkins' book, is
) > it the right kind of hardware to begin with?
) 
) For the time being, it is the software (the conceptual framework, the high
) level architecture) that matters most.
) 
) If someone has naive views about the AGI problem, about the various issues
) that must be relevant to the design of a thinking system (like, if they have
) no comprehensive knowledge of both cognitive science and AI, among other
) things), and yet that person has really strong views about the hardware that
) we MUST use to build an intelligent system, what I hear is "Hey, I don't know
) exactly what you guys are doing, but I know you need THIS!".   Hmmmm. Just
) keep banging the rocks together.
) 
) Having said that, there is an element of truth in what Hawkins says.  My
) personal opinion is that he has only a fragment of the truth, however, and is
) mistaking it for the whole deal.
) 
) 
) Richard Loosemore.

I like what Hawkins has to say, despite his railing against A.I., as 
most everyone does.  We just keep trucking along, making new ways to think 
about the brain...  The simple hierarchical learning and inference system 
that he describes as "the neocortex with the hippocampus sitting on top" 
is absurdly simplistic, and his ideas about that vague and mystical word 
Consciousness are pretty far out--can't say I agree with much of that.

I feel Ramachandran is a good reference for what it is to perceive, which 
is my current best understanding of Consciousness...
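As a side note, Hawkins' first statement - intelligence as prediction on 
memory - can be made concrete with a toy sketch.  This is purely 
illustrative (a first-order sequence predictor, NOT Hawkins' hierarchical 
cortical model, and the class name is made up here):

```python
from collections import Counter, defaultdict

# Toy illustration of "intelligence is the ability to make predictions
# on memory": memorize observed transitions, then predict what comes next.
class MemoryPredictor:
    def __init__(self):
        # context symbol -> counts of symbols seen to follow it
        self.memory = defaultdict(Counter)

    def observe(self, sequence):
        # Store every adjacent pair from the experienced sequence.
        for prev, nxt in zip(sequence, sequence[1:]):
            self.memory[prev][nxt] += 1

    def predict(self, context):
        # Predict the most frequently remembered successor, if any.
        counts = self.memory.get(context)
        if not counts:
            return None  # no memory of this context, so no prediction
        return counts.most_common(1)[0][0]

p = MemoryPredictor()
p.observe("abcabcabd")
print(p.predict("a"))  # memory says 'b' has always followed 'a'
```

Hawkins' actual proposal stacks many such predictors into a hierarchy, with 
higher levels predicting over longer timescales - this flat sketch only 
shows the memory-then-predict loop itself.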

Bo

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303
