Your exploration is moving away from my initial question: why do goals get 
differing priorities?
But I'll bite: we can consider a situation to be a set of items, as follows:
(prototype Situation   :Items { } )
Your question: how will the AGI program differentiate the overgeneralizations 
that it will tend to develop as it responds to various situations?
Let's develop a framework with which to answer it.
Let's assume an action has preconditions, steps, and postconditions.
(prototype Action   :Preconditions { }  :Steps { }  :Postconditions { } )
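
As a side note, here is a minimal Python sketch of these two prototypes, 
treating the slots as plain sets of symbols. The class and field names are 
just one possible encoding of the prototype notation above, not part of it:

    from dataclasses import dataclass

    @dataclass
    class Situation:
        name: str                                  # the ^ tag, e.g. situation_1
        items: frozenset = frozenset()             # the :Items slot

    @dataclass
    class Action:
        name: str                                  # the ^ tag, e.g. Action_1
        preconditions: frozenset = frozenset()     # the :Preconditions slot
        steps: tuple = ()                          # the :Steps slot
        postconditions: frozenset = frozenset()    # the :Postconditions slot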

For now, we'll ignore the steps and postconditions and focus on the 
preconditions.
So now we can say that a precondition represents a situation just in case the 
precondition's items are the same as those in the situation:
  (given [Situation ^ ?S :Items ?items]
         [Action ^ ?A (same @:Preconditions ?items)]
     do
         (print 'the preconditions of action ' ?A ' match the situation ' ?S))
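
For concreteness, a rough Python equivalent of that rule, using the sketch 
classes above (the nested-loop encoding is my assumption; a real rule engine 
would index rather than scan all pairs):

    def report_matches(situations, actions):
        # Mirror of the (given ... do ...) rule: fire for every
        # (situation, action) pair whose item sets are identical.
        for s in situations:
            for a in actions:
                if a.preconditions == s.items:
                    print('the preconditions of action', a.name,
                          'match the situation', s.name)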
The same goes for the postconditions of an action.
Suppose we have a set of situations and a set of actions.
We can say that the items in the precondition of an action are more concrete 
than the items in a situation if the precondition contains more items than 
the situation:
[Situation ^ situation_1 :Items { quick brown fox } ]
[Action ^ Action_1 :Preconditions { quick brown fox jumped } ... ]
We can say that the items in the precondition of an action are more abstract 
than the items in a situation if the precondition contains fewer items than 
the situation:
[Situation ^ situation_1 :Items { quick brown fox } ]
[Action ^ Action_1 :Preconditions { quick brown } ... ]
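
Reading 'more items' and 'fewer items' the way the two examples use them (the 
smaller set is contained in the larger one), these two relations could be 
sketched in Python, using the classes from the first sketch:

    def more_concrete(action, situation):
        # The precondition strictly contains the situation's items,
        # e.g. { quick brown fox jumped } vs. { quick brown fox }.
        return action.preconditions > situation.items

    def more_abstract(action, situation):
        # The precondition is strictly contained in the situation's items,
        # e.g. { quick brown } vs. { quick brown fox }.
        return action.preconditions < situation.items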
We can say that the items in the precondition of an action are more specific 
than the items in a situation if the precondition contains items that are 
lower in a taxonomy than those in the situation:
[Situation ^ situation_1 :Items { quick brown fox } ]
[Action ^ Action_1 :Preconditions { quick dark-brown fox } ... ]
[Type :Subtype dark-brown :Supertype brown ]

We can say that the items in the precondition of an action are more general 
than the items in a situation if the precondition contains items that are 
higher in a taxonomy than those in the situation:
[Situation ^ situation_1 :Items { quick brown fox } ]
[Action ^ Action_1 :Preconditions { quick brown animal } ... ]
[Type :Subtype fox :Supertype animal ]
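
Encoding the [Type :Subtype ... :Supertype ...] facts as a child-to-parent 
map, one possible reading of 'lower/higher in a taxonomy' is the item-by-item 
comparison below. The original notation doesn't say how multi-item sets are 
compared, so this is my assumption:

    # Child -> parent map built from the [Type :Subtype x :Supertype y] facts.
    SUPERTYPE = {'dark-brown': 'brown', 'fox': 'animal'}

    def is_at_or_below(sub, sup):
        # True if sub equals sup or lies below it in the taxonomy.
        while sub is not None:
            if sub == sup:
                return True
            sub = SUPERTYPE.get(sub)
        return False

    def more_specific(action, situation):
        # Every precondition item sits at or below some situation item,
        # and the sets differ (e.g. dark-brown is below brown).
        return (action.preconditions != situation.items and
                all(any(is_at_or_below(p, s) for s in situation.items)
                    for p in action.preconditions))

    def more_general(action, situation):
        # Symmetric case: every situation item sits at or below some
        # precondition item (e.g. fox is below animal).
        return (action.preconditions != situation.items and
                all(any(is_at_or_below(s, p) for p in action.preconditions)
                    for s in situation.items))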

So can you please refine your question at this point, so that it may be 
answered?
How will the AGI program differentiate the overgeneralizations that it will 
tend to develop as it responds to various situations?
Cheers,
~PM
Date: Tue, 30 Sep 2014 20:53:01 -0400
Subject: Re: [agi] Are all goals created equal?
From: [email protected]
To: [email protected]

On Sun, Sep 28, 2014 at 12:17 AM, Piaget Modeler via AGI <[email protected]> 
wrote:

You create an AGI and endow it with an initial set of actions and needs. You 
prioritize the needs so that, for example, power is more important than signal 
strength. Beyond that, you plug it in and let it run. The AGI begins to create 
goals and subgoals in order to satisfy its needs. But how does it prioritize 
these goals and subgoals? We can distinguish urgent from important goals by 
creating separate attributes for urgency and priority. But why would one goal 
get a higher priority than another, aside from inheriting its priority from a 
basic need? Perhaps I lack imagination. Your thoughts? ~PM

I think the terms that you are using to characterize the problem are 
over-generalized. I guess that when I complain about the necessity of 
examining questions like this with more detailed characterizations, people 
who understand what I am trying to say simply dismiss it as obvious. The AGI 
program would be designed to use (and to learn to use) actions under 
appropriate conditions, and every programmer 'gets' what conditional actions 
are. But perhaps the problem that I see is that we need to write these 
programs so that they will use broad generalizations (just as, according to 
my point of view, you wrote your message using broad generalizations). So 
then the question of how the AGI program will differentiate the 
over-generalizations that it will tend to develop as it responds to various 
situations becomes the more essential question.

So, for example, trying to get a better understanding of a particular 
situation, and using that to further shape the characterization of that 
'kind' of situation, may become a prioritized goal just because it may be (or 
seem like) a necessary step that the program needs to take to better 
understand the situation. To add a little to this view of how an AGI might 
work, notice that a situation may be characterized as being like different 
-kinds- of situations. This basic insight is something that I never read in 
other people's comments - is that because it is so obvious? A situation is 
comprised of the 'components' of the situation, and each of these may lead to 
the development of different kinds of situations which the particular 
situation may be likened to. And the situation may belong to other categories 
that are based on the projection of some other kind of idea onto it.

Jim Bromer
