On Wednesday 21 February 2007 15:34, Eric Baum wrote:
> Josh> The other idea in OI worth noting is Mountcastle's Principle,
> Josh> that all of the cortex seems to be doing the same thing. ...

> [Hawkins is] basically asserting that what's wired in is unimportant,
> but this neglects the computational learning theory,
> which indicates it is.

Well, I can't speak for Hawkins, but I think there's room for plenty of 
ontological levels in between the homogeneity and the learning bias. As you 
(Eric) pointed out, evolution likes to reuse basic designs with variations. I 
find it unremarkable that there might be a basic computational substrate in 
cortex that would look more or less the same to a neuroscientist under a 
microscope. 
Further, we know that the areas responsible for control and sensing move around, and grow and shrink, with use or disuse. This strikes me as nothing more than 
load-balancing. 

The questions are: (a) does the computational fabric itself have properties 
that we would find useful for AI directly, or should we just use a big 
Beowulf cluster and look a few levels higher up for inspiration? And (b) if it 
does have useful properties, what are they?

My own guess is that there are some associative and parallel operations in the 
substrate that we'd like to have in hardware once we get them figured out. 
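
To make "associative" concrete, here's a toy sketch (my own illustrative example, not a claim about cortex or about OI): a content-addressable memory that recalls a stored pattern from a noisy cue by nearest Hamming distance. The distance computation across all stored patterns is embarrassingly parallel, which is the kind of operation one might want in hardware.

```python
# Toy content-addressable (associative) memory: recall the stored
# pattern nearest to a noisy cue by Hamming distance. The per-pattern
# distances are independent, so in hardware they could be computed in
# parallel. Illustrative sketch only.

def recall(memory, cue):
    """Return the stored pattern closest to `cue` in Hamming distance."""
    def hamming(a, b):
        # Count the positions where the two bit-tuples disagree.
        return sum(x != y for x, y in zip(a, b))
    return min(memory, key=lambda pattern: hamming(pattern, cue))

stored = [
    (1, 0, 1, 1, 0, 0),
    (0, 1, 0, 0, 1, 1),
    (1, 1, 1, 0, 0, 1),
]
noisy_cue = (1, 0, 1, 0, 0, 0)    # first pattern with one bit flipped
print(recall(stored, noisy_cue))  # → (1, 0, 1, 1, 0, 0)
```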

As for learning bias, there's a nasty bias in classical AI that is left over 
from the sectarian rift between AI and cybernetics in the '50s. I happen to 
think that science has come up with a few useful representations, and that 
math, diff eq's, and systems theory are excellent building blocks. I'm 
willing to give these, plus logic and statistics, a shot before being 
convinced that we need something new.

Getting representations right is tough, but I'm also willing to work under the 
assumption that the brain does a fair amount of it by brute force. That is: yes, 
it's NP-hard, but with some heuristics and some teraflops you can get close 
enough often enough.
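
A standard illustration of "heuristics get close enough" (my example, chosen for familiarity; not anything specific to the brain): set cover is NP-hard, yet the greedy heuristic runs fast and is provably within a ln(n) factor of optimal.

```python
# Greedy heuristic for set cover (the exact problem is NP-hard).
# Repeatedly pick the subset covering the most still-uncovered
# elements -- fast, and within a ln(n) factor of optimal.

def greedy_set_cover(universe, subsets):
    uncovered = set(universe)
    cover = []
    while uncovered:
        # Greedy choice: the subset with the largest fresh coverage.
        best = max(subsets, key=lambda s: len(uncovered & s))
        cover.append(best)
        uncovered -= best
    return cover

universe = range(1, 8)
subsets = [{1, 2, 3}, {2, 4}, {3, 5, 6}, {4, 5, 7}, {6, 7}]
print(greedy_set_cover(universe, subsets))
```

Here greedy happens to find a cover of three subsets, which is also optimal; in general it can be slightly worse, but never by more than the logarithmic factor.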

Josh

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303
