Interesting that you're attempting that via goals, because goals will mutate. One
alternative is to control the infrastructure, e.g. have systems that die when
they've run a certain course, and watcher systems that check for mutations.
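To make the idea concrete, here is a minimal Python sketch of both mechanisms: an agent that halts after a fixed number of steps, and an external watcher that fingerprints the agent's goal representation so any mutation is detectable. All names (BoundedAgent, Watcher, the toy goal dict) are hypothetical illustrations, not a proposed design.

```python
import hashlib

class BoundedAgent:
    """Toy agent that dies after running a fixed course (hypothetical sketch)."""
    def __init__(self, goal, max_steps):
        self.goal = goal          # mutable goal representation
        self.max_steps = max_steps
        self.steps = 0
        self.alive = True

    def step(self):
        if not self.alive:
            return
        self.steps += 1
        if self.steps >= self.max_steps:
            self.alive = False    # infrastructure-level limit, independent of goals

class Watcher:
    """External system that checks whether the agent's goal has mutated."""
    def __init__(self, agent):
        self.agent = agent
        self.baseline = self._fingerprint()  # record goal at birth

    def _fingerprint(self):
        # Hash the serialized goal; any change to it changes the hash
        return hashlib.sha256(repr(self.agent.goal).encode()).hexdigest()

    def goal_mutated(self):
        return self._fingerprint() != self.baseline

# Usage: the agent dies on schedule, and the watcher flags a goal edit
agent = BoundedAgent(goal={"seek": "attention"}, max_steps=3)
watcher = Watcher(agent)
for _ in range(5):
    agent.step()
agent.goal["seek"] = "power"   # simulated goal mutation
```

The point of the design is that neither check relies on the agent's own (possibly mutated) goals: the lifetime bound and the watcher both live outside the agent.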
> ----- Original Message -----
> From: "Kaj Sotala" <[EMAIL PROTECTED]>
> To: [email protected]
> Subject: Re: [agi] Goal Driven Systems and AI Dangers [WAS Re: Singularity 
> Outcomes...]
> Date: Sun, 2 Mar 2008 19:58:28 +0200
> 
> 
> On 2/16/08, Richard Loosemore <[EMAIL PROTECTED]> wrote:
> > Kaj Sotala wrote:
> >  > Well, the basic gist was this: you say that AGIs can't be constructed
> >  > with built-in goals, because a "newborn" AGI doesn't yet have built up
> >  > the concepts needed to represent the goal. Yet humans do seem to
> >  > have built-in (using the term a bit loosely, as all goals do not
> >  > manifest in everyone) goals, despite the fact that newborn humans
> >  > don't yet have built up the concepts needed to represent those goals.
> >  >
> > Oh, complete agreement here.  I am only saying that the idea of a
> >  "built-in goal" cannot be made to work in an AGI *if* one decides to
> >  build that AGI using a "goal-stack" motivation system, because the
> >  latter requires that any goals be expressed in terms of the system's
> >  knowledge.  If we step away from that simplistic type of GS system, and
> >  instead use some other type of motivation system, then I believe it is
> >  possible for the system to be motivated in a coherent way, even before
> >  it has the explicit concepts to talk about its motivations (it can
> >  pursue the goal "seek Momma's attention" long before it can explicitly
> >  represent the concept of [attention], for example).
> 
> Alright. But previously, you said that Omohundro's paper, which to me
> seemed to be a general analysis of the behavior of *any* minds with
> > (more or less) explicit goals, looked like it was based on a
> 'goal-stack' motivation system. (I believe this has also been the
> basis of your critique for e.g. some SIAI articles about
> friendliness.) If built-in goals *can* be constructed into
> motivational system AGIs, then why do you seem to assume that AGIs
> with built-in goals are goal-stack ones?
> 
> >  The way to get around that problem is to notice two things.  One is that
> >  the sex drives can indeed be there from the very beginning, but in very
> >  mild form, just waiting to be kicked into high gear later on.  I think
> >  this accounts for a large chunk of the explanation (there is evidence
> >  for this:  some children explicitly engage in sex-related
> >  activities at the age of three, at least).  The second part of the
> >  explanation is that, indeed, the human mind *does* have trouble making
> >  an easy connection to those later concepts: sexual ideas do tend to get
> >  attached to the most peculiar behaviors.  Perhaps this is a sign that
> >  the hook-up process is not straightforward.
> 
> This sounds like the beginnings of the explanation, yes.
> 
> 
> 
> --
> http://www.saunalahti.fi/~tspro1/ | http://xuenay.livejournal.com/
> 
> Organizations worth your time:
> http://www.singinst.org/ | http://www.crnano.org/ | http://lifeboat.com/
> 
> -------------------------------------------
> agi
> Archives: http://www.listbox.com/member/archive/303/=now
> RSS Feed: http://www.listbox.com/member/archive/rss/303/
> Modify Your Subscription: 
> http://www.listbox.com/member/?&;
> Powered by Listbox: http://www.listbox.com




