Abram,
I'm giving a talk in a couple of weeks at CISIS-12 entitled "Patterns for 
Cognitive Systems." In the paper I discuss a number of patterns and their 
functional elements. Attention is an element of the Motivation pattern. 
So, if one is implementing the Motivation pattern, then you need an attention 
component to reprioritize your goals (because, by definition, that component is 
part of the pattern).
Circular argument, but it suffices. The good news is that if you're not 
implementing motivation, then you can exclude it from your architecture.
The bottom line becomes: what is the architecture of your cognitive system? In 
this specific architecture, PAM-P2, I have an attention component, so I'm 
trying to discern useful inputs to that component beyond what is prescribed 
by the pattern.
Is PAM-P2 the best architecture? By no means. It's just one architecture that 
implements specific patterns.
For me, the question of Graphplan versus SHOP2 versus CBP is just one of 
functionality for a specific component within a pattern, namely, the planner.
In fact, for that architectural element, the planner, one should be able to 
seamlessly replace the implementation. This means that at some point I can use 
a Graphplan planner, or swap it out for a case-based planner, or for an HTN 
planner, if the interfaces are consistently defined.
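To illustrate the point about consistently defined interfaces, here is a minimal sketch of a swappable planner component. All names, types, and the stub bodies are hypothetical illustrations, not PAM-P2's actual interfaces:

```python
from abc import ABC, abstractmethod

# Hypothetical representations: a state is a set of facts, a plan a list of actions.
State = frozenset
Plan = list

class Planner(ABC):
    """Common planner interface, so implementations can be swapped seamlessly."""
    @abstractmethod
    def plan(self, initial: State, goal: State) -> Plan:
        ...

class GraphplanPlanner(Planner):
    def plan(self, initial, goal):
        # Graphplan-style constraint search would go here; stubbed for illustration.
        return []

class HTNPlanner(Planner):
    def plan(self, initial, goal):
        # HTN task decomposition would go here; stubbed for illustration.
        return []

def solve(planner: Planner, initial: State, goal: State) -> Plan:
    # The rest of the architecture depends only on the Planner interface,
    # never on which implementation is plugged in.
    return planner.plan(initial, goal)
```

The point is only the shape of the seam: any implementation satisfying `plan(initial, goal) -> Plan` can be dropped in without touching the rest of the system.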

Now back to the question at hand...
~PM. 

Date: Tue, 12 Jun 2012 17:50:24 -0700
Subject: Re: [agi] Attention
From: [email protected]
To: [email protected]

PM,

Allow me to ask a question...
Should there be such a component?
This is my own view, not necessarily strongly supported by the field, but... 
the way I see it, the literature on classical planning is a graveyard of 
algorithms based on goal-subgoal hierarchies. Researchers try very hard to get 
this sort of an approach to work (and succeed), because that is the way we 
intuitively think we think. However, that doesn't mean it is the best approach. 
A more bottom-up & global approach may work better.

Hierarchical planners were soundly defeated by Graphplan in 1999, an algorithm 
that treated planning as a constraint satisfaction problem (using a standard 
solver for those problems) rather than attempting to solve it in a pseudo-human 
way. Hierarchical planners came back, using Graphplan as a sub-component, but I 
see that as more a result of the stick-to-itiveness of the hierarchical planning 
community than a fundamental result.

That's in the domain of classical planning. Outside that domain, things like RL 
and Monte Carlo planning are used; subgoal hierarchies do not exist...
This could well be for lack of sophistication. Maybe the current 'flat' 
techniques will be replaced by hierarchical planning in time, even in very 
messy domains. (I know that hierarchical RL has been implemented more than 
once, with different approaches...) Certainly I think an advanced AGI system 
will need the capability to think about plans on multiple levels of 
abstraction. However, we don't know "where" in the mental architecture this 
occurs. What I'm saying (or, what I think I'm saying) is that it doesn't seem 
right to have it at the lowest level. Subgoal prioritization may be a rather 
advanced form of reasoning which emerges as a result of a lot of lower-level 
stuff.

Of course, something to direct reasoning at the low level is needed! However, 
it might be a very "different" algorithm from what you expect... Really, I'm 
just trying to encourage some creative thinking here. :)

--Abram
On Mon, Jun 11, 2012 at 5:31 PM, Piaget Modeler <[email protected]> 
wrote:

Abram, you've characterized it properly. In my vernacular, subgoals = goals. 

I would say that the job of this particular attention module is to reprioritize 
the open goal set, given all available information.


So the question for me is: what should "all available information" consist of? 
Some candidates are:
(1) the current context, for sure,
(2) alerts,
(3) expectation failures and mismatches,
(4) past prioritizations,
(5) past episodes.
Anything else? 
Your thoughts? 
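One way to make the candidates above concrete is a simple scoring pass over the open goal set. Everything here is an illustrative assumption (the names, the weights, and the linear blend), not a claim about how PAM-P2 actually works:

```python
from dataclasses import dataclass

@dataclass
class Goal:
    name: str
    priority: float = 0.0

def reprioritize(goals, context_relevance, alerts, expectation_failures,
                 past_priorities):
    """Rescore each open goal from 'all available information'.

    context_relevance: goal name -> relevance to the current context (0..1)
    alerts: set of goal names flagged by an alert
    expectation_failures: goal names whose expectations failed or mismatched
    past_priorities: goal name -> previously assigned priority

    The weights are arbitrary placeholders for illustration.
    """
    for g in goals:
        score = context_relevance.get(g.name, 0.0)
        if g.name in alerts:
            score += 1.0   # alerts demand immediate attention
        if g.name in expectation_failures:
            score += 0.5   # mismatches suggest replanning is needed
        # Blend with past prioritization so the ordering is stable over time.
        g.priority = 0.7 * score + 0.3 * past_priorities.get(g.name, 0.0)
    return sorted(goals, key=lambda g: g.priority, reverse=True)
```

Past episodes could feed in the same way, e.g. as a learned prior over `context_relevance`; the sketch leaves that slot open.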


-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-c97d2393
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=21088071&id_secret=21088071-2484a968
Powered by Listbox: http://www.listbox.com
