Sergio,
In the PAM-P2 system, there are Associator processes which create associations. 
The only human necessary is the designer.
Breaking channel (modality) dependence just means creating rules for intermodal 
associations.  That, too, can be programmed.
PAM-P2 is under construction.  Stay tuned.
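
Since PAM-P2 is still under construction, here is only a hypothetical sketch of what an Associator process might do: record which patterns from different modality channels co-occur, so that the cross-modal link itself is learned rather than hand-coded. All names (`Associator`, the modality/pattern labels) are illustrative assumptions, not PAM-P2 internals.

```python
from collections import defaultdict

class Associator:
    """Hypothetical sketch of an Associator process: it links patterns
    from different modality channels that are observed together."""

    def __init__(self):
        # (modality, pattern) -> set of associated (modality, pattern) pairs
        self.links = defaultdict(set)

    def observe(self, concurrent_patterns):
        """concurrent_patterns: list of (modality, pattern_id) seen together."""
        for a in concurrent_patterns:
            for b in concurrent_patterns:
                if a != b:
                    self.links[a].add(b)

    def associated(self, modality, pattern_id):
        return self.links[(modality, pattern_id)]

assoc = Associator()
assoc.observe([("vision", "face#7"), ("audio", "word:face")])
print(assoc.associated("vision", "face#7"))  # {('audio', 'word:face')}
```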


------------------------------------------------------------
Confidential - This message is meant solely for the intended recipient. Please 
do not copy or forward this message without the consent of the sender. If you 
have received this message in error, please delete the message and notify the 
sender.

> From: [email protected]
> To: [email protected]
> Subject: RE: [agi] Building high-level features using large scale 
> unsupervised learning
> Date: Fri, 29 Jun 2012 10:52:11 -0500
> 
> Alan,
> 
> You make the same traditional mistake that many make. You offer a
> substantial scientific contribution obtained by humans using their human
> intelligence, but never explain the process in their brains that made it
> possible for them to do all that. Jeff Hawkins (On Intelligence) does the
> same. His entire book revolves around certain "invariant representations"
> but he never explains where those come from. Your proposal is the usual
> mixture of man-made intelligence and promises. It does not contribute to
> artificial intelligence. 
> 
> The keywords are "association" and "emerge." You write:
> 
> "The next challenge is that you need to break channel dependence and
> introduce associations between patterns, i.e., between faces and the various
> representations of the word "face". I suspect that once channel dependence
> is fixed, then, at some high level in the network, these associations will
> emerge on their own."
> 
> "Introduce associations" means, in your mind, that you or some other human
> will be using their human intelligence, not artificial intelligence, to find
> those associations and introduce them in the program. "Emerge" indicates you
> realize that something is still missing and blame something else that you
> can't explain. You are not alone. 
> 
> 
> Sergio
> 
> 
> -----Original Message-----
> From: Alan Grimes [mailto:[email protected]] 
> Sent: Friday, June 29, 2012 8:20 AM
> To: AGI
> Subject: Re: [agi] Building high-level features using large scale
> unsupervised learning
> 
> Ben Goertzel wrote:
> > How exactly do you suggest to bridge the functionality gap between 
> > visual pattern recognition and all the other things human beings do?
> 
> =)
> 
> Setting aside problems noted as still being unsolved, here's a crude sketch
> of how the system can be organized. For the sake of brevity, only the
> cortical-thalamic-cortical system will be considered.
> 
> The first thing to note is that this is an unsupervised pattern learner.
> That should be pretty amazing all by itself. The second thing to note is
> that all it deals with are vectors of numbers. There is no reason on earth
> that it can't be made to work with any conceivable stimulus that can be
> encoded as a vector of numbers. There are some serious channel dependence
> problems, previously noted, but the basic process is present.
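
The "all it deals with are vectors of numbers" point above can be made concrete with a small, hypothetical illustration (the encoders and normalization scheme are my assumptions, not anything from the cited work): any stimulus, once encoded as a fixed-length float vector, can be fed to the same unsupervised learner regardless of modality.

```python
import numpy as np

def encode_image(pixels):
    # e.g. a flattened grayscale patch, normalized to unit length
    v = np.asarray(pixels, dtype=float).ravel()
    return v / (np.linalg.norm(v) or 1.0)

def encode_audio(samples):
    # e.g. a window of audio samples, normalized the same way
    v = np.asarray(samples, dtype=float).ravel()
    return v / (np.linalg.norm(v) or 1.0)

# Both channels yield plain numeric vectors the same learner can consume.
img_vec = encode_image([[0, 255], [255, 0]])
aud_vec = encode_audio([0.1, -0.4, 0.2, 0.3])
print(img_vec.shape, aud_vec.shape)  # (4,) (4,)
```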
> 
> The third thing to note is that they could run their matrix stack in reverse
> and "imagine" what a face looks like. This is critical, especially for motor
> control! =P
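
"Running the matrix stack in reverse" can be sketched as a tied-weights encode/decode pair: the same matrix maps input to code (perception) and, transposed, maps code back to input space (imagination). This is only a minimal, untrained illustration under that tied-weights assumption, not the actual architecture of the work being discussed.

```python
import numpy as np

rng = np.random.default_rng(0)

# One layer of the stack: 32-dim input, 8-dim code. Weights are
# random here; a real system would learn them unsupervised.
W = rng.standard_normal((8, 32)) * 0.1

def perceive(x):
    """Forward pass: stimulus -> high-level code."""
    return np.tanh(W @ x)

def imagine(code):
    """Reverse pass: code -> stimulus space ("imagining" the input)."""
    return W.T @ code

x = rng.standard_normal(32)
reconstruction = imagine(perceive(x))
print(reconstruction.shape)  # (32,)
```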
> 
> This is your basic algorithm. The next challenge is that you need to break
> channel dependence and introduce associations between patterns, i.e., between
> faces and the various representations of the word "face". I suspect that once
> channel dependence is fixed, then, at some high level in the network, these
> associations will emerge on their own.
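
One hypothetical mechanism for associations "emerging on their own" is a Hebbian layer above two channels: units that fire together get wired together, so a face code and a word-"face" code become linked without anyone programming that specific association. The one-hot codes and learning rule here are illustrative assumptions.

```python
import numpy as np

n = 8                             # units per channel
H = np.zeros((n, n))              # cross-channel Hebbian weights

def co_present(vision_code, audio_code, lr=0.1):
    """Outer-product Hebbian update: strengthen co-active pairs."""
    global H
    H += lr * np.outer(vision_code, audio_code)

face_visual = np.eye(n)[2]        # toy code for a seen face
face_word = np.eye(n)[5]          # toy code for the heard word "face"
for _ in range(10):
    co_present(face_visual, face_word)

# Seeing the face now evokes the word's code:
evoked = H.T @ face_visual
print(int(np.argmax(evoked)))     # 5
```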
> 
> The next issue is topology. You could organize the topology like the human
> brain and, in theory, it should be human-equivalent. Motor control is
> implemented just like perception: it builds up complex sequences of actions
> from simple sequences of actions, exactly as complex perceptions are built up
> from simple perceptions. To do something, you just run the stack in reverse,
> as mentioned above. Combined with channel independence and free association,
> you obtain arbitrary sequences of planned actions.
> Actions that are fully learned become habitual (simply initiate the top-level
> abstraction). Other actions require an iterative system-wide process for
> planning, but most of the mechanisms are already present.
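
"Simply initiate the top-level abstraction" can be read as top-down expansion of an action hierarchy into primitives; the action names and tree below are purely made-up illustrations of that reading, not part of any described system.

```python
# A habitual action as a tree of sub-actions; execution is just
# expanding the hierarchy top-down until only motor primitives remain.
ACTIONS = {
    "make_tea": ["boil_water", "steep"],
    "boil_water": ["fill_kettle", "heat"],
    "steep": ["add_bag", "wait"],
}

def expand(action):
    """Recursively expand an abstraction into primitive actions."""
    if action not in ACTIONS:
        return [action]                      # already primitive
    out = []
    for sub in ACTIONS[action]:
        out.extend(expand(sub))
    return out

print(expand("make_tea"))  # ['fill_kettle', 'heat', 'add_bag', 'wait']
```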
> 
> You obtain episodic memory by having a pipeline that associates concurrent
> perceptions, which appears to be what the hippocampus does.
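
The episodic-memory idea — a pipeline that binds whatever is perceived at the same moment into one retrievable episode — can be sketched as follows. The perception labels and the flat list store are assumptions for illustration only; they are not a model of the hippocampus.

```python
episodes = []

def record(t, perceptions):
    """Bind all perceptions present at time t into one episode."""
    episodes.append((t, frozenset(perceptions)))

def recall(cue):
    """Any single perception retrieves every whole episode containing it."""
    return [(t, sorted(p)) for t, p in episodes if cue in p]

record(0, ["beach", "seagull_cry", "salt_smell"])
record(1, ["office", "keyboard"])
print(recall("seagull_cry"))  # [(0, ['beach', 'salt_smell', 'seagull_cry'])]
```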
> 
> To obtain super-human intelligence, you need to make the topology of the
> system adaptive, or even accessible to the system itself. Ideally, you want
> a highly redundant, highly distributed, highly parallel and highly efficient
> architecture. This architecture does have a second class of scalability
> issues: each matrix, at each level of abstraction, is of fixed size. There
> needs to be a process that simplifies and consolidates knowledge into a more
> ideal representation. At that point you're off the edge of the
> (metaphorical) napkin I sketched this all out on. =P
> 
> About 80% of everything else you need is already available off the shelf;
> the other 20% might pose some important, perhaps even difficult, challenges,
> but then we're talking about emotions and motivation rather than
> intelligence.
> 
> --
> E T F
> N H E
> D E D
> 
> Powers are not rights.
> 


-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-c97d2393
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=21088071&id_secret=21088071-2484a968
Powered by Listbox: http://www.listbox.com