Bob Mottram wrote:
On 18/02/2008, Richard Loosemore <[EMAIL PROTECTED]> wrote:
... might be true. Yes, a motivation of some form could be coded into
the system, but the paucity of expression at the level at which it is
coded may still allow "unintended" motivations to emerge.


It seems that in the AGI arena much emphasis is put on designing goal
systems.  But in nature behavior is not always driven explicitly by
goals.  A lot of behavior, I suspect, is just drift, and understanding
it requires examining the dynamics of the system.  For example, if I'm
talking on the phone and doodling with a pen, this doesn't necessarily
imply that I have explicitly instantiated a goal of "draw doodle".
Likewise, changes in a population's gene pool do not necessarily mean
that explicit selection forces are at work.

My supposition is that the same dynamics seen in natural systems will
also apply to AGIs, since these are all examples of complex dynamical
systems.

Oops: the above quote was attached to my name in error; I believe Harshad wrote that, not I.


But regarding your observation, Bob: I have previously advocated a distinction between "diffuse motivation systems" and "goal-stack systems". As you say, most AI systems simply assume that what controls the AI is a goal stack.
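The contrast might be sketched in code, purely as a hypothetical illustration (this is not Richard's actual design, and the names and numbers below are invented): a goal-stack agent acts by popping explicit goals, while a "diffuse" agent has no goal objects at all, only interacting drive strengths whose dynamics produce behavior such as idle doodling.

```python
# Hypothetical sketch contrasting two control schemes.
# All function names, drives, and constants are illustrative inventions.

def goal_stack_agent(stack):
    """Explicit control: pop and pursue goals, most recent first."""
    actions = []
    while stack:
        actions.append("pursue:" + stack.pop())
    return actions

def diffuse_agent(drives, steps=3):
    """No explicit goals: at each step, behavior simply follows whichever
    drive currently dominates, and acting on a drive dampens it
    (a crude satiation dynamic), so activity drifts between drives."""
    actions = []
    for _ in range(steps):
        strongest = max(drives, key=drives.get)
        actions.append("drift:" + strongest)
        drives[strongest] *= 0.5  # acting on a drive reduces its strength
    return actions

print(goal_stack_agent(["answer phone", "finish report"]))
print(diffuse_agent({"doodle": 0.9, "talk": 0.8}))
```

In the diffuse case no "draw doodle" goal is ever instantiated; doodling appears and disappears as a side effect of the drive dynamics, which is the kind of drift Bob describes.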

I will write up this distinction on a web page shortly.



Richard Loosemore

-------------------------------------------
agi
Archives: http://www.listbox.com/member/archive/303/=now
RSS Feed: http://www.listbox.com/member/archive/rss/303/
Modify Your Subscription: 
http://www.listbox.com/member/?member_id=8660244&id_secret=95818715-a78a9b
Powered by Listbox: http://www.listbox.com
