Well, there is no doubt that hardwiring can have a significant effect on a system's propensity to take various kinds of actions.
 
However, the human urge to reproduce is a case in point of the ability of a complex self-organizing AI system to act against its innate propensities.
 
We humans nearly all have a strong urge to do the wild thang, and yet, some people choose celibacy... some people "sublimate" the desire into other pursuits ... etc. etc.
 
ben
-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]]On Behalf Of Kevin Copple
Sent: Sunday, January 12, 2003 9:41 AM
To: [EMAIL PROTECTED]
Subject: RE: [agi] Ethics for AGIs

Ben Goertzel wrote:

 

***

I am not bullish on the possibility of reliably hard-wiring any specific content into a real AGI system....

***

 

By way of counterexample, consider me.  I have found that I am high-priority hardwired to marshal my resources to reproduce, at least for one particular step in the process.  I suppose I could control myself more than I do.  Or I could use contraceptives to thwart the urge.  But as an AGI (without the “A”), let me tell you that I AM hardwired!

 

Surely we could make an AGI that would also be hardwired with the urge not to “use contraception,” just as I suspect evolution would do with the human race given a few generations.  But if the AGI is designed with unlimited morphing abilities, then by definition it is not susceptible to this hardwiring.

 

It seems the issue is one of “type of AGI,” not whether hard-wiring is possible in ANY type of AGI.  Assuming, of course, you admit that I am an AGI (I have always been uncomfortable with “artificial” used in conjunction with intelligence anyway, so I may as well use it on myself to be consistent).

 

Kevin Copple
