On Thursday 12 June 2008 02:48:19 am, William Pearson wrote:

> The kinds of choices I am interested in designing for at the moment
> are whether program X or program Y should get control of this bit of
> memory or IRQ for the next time period. X and Y can also make choices
> and you would need to nail them down as well in order to get the
> entire U(x) as you talk about it.
> would need to nail them down as well in order to get the entire U(x)
> as you talk about it.
> 
> As the function I am interested in is only concerned about
> programmatic changes call it PCU(x).
> 
> Can you give me a reason why the utility function can't be separated
> out this way?


This is roughly equivalent to a function where the highest-level arbitrator 
gets to set the most significant digit, the programs X and Y the next most, 
and so forth. As long as the possibility space is partitioned at each stage, 
the whole business is rational -- it doesn't contradict itself.
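The "digits" picture can be sketched as lexicographic comparison of utility tuples -- a minimal illustration, not anyone's actual system; the function name and digit values here are made up:

```python
# Utility of an outcome as a tuple: (arbitrator's digit, program's digit).
# Python compares tuples lexicographically, so the arbitrator's digit
# always dominates: a subordinate program's refinement can make finer
# distinctions but can never overturn the arbitrator's ordering.
def utility(arbitrator_digit, program_digit):
    return (arbitrator_digit, program_digit)

a = utility(2, 0)  # arbitrator rates this outcome higher...
b = utility(1, 9)  # ...even though the program rates b maximally
assert a > b       # the more significant digit wins

# Within the same arbitrator digit, the program's digit decides.
assert utility(1, 5) > utility(1, 4)
```

Because the space is partitioned at each stage, no combination of lower digits can flip a higher-level preference, which is exactly why the scheme stays consistent.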

Allowing the program to play around with the less significant digits, i.e. to 
make finer distinctions, is probably pretty safe (and the way many AIers 
envision doing it). It's also reminiscent of the way Maslow's hierarchy 
works.

But it doesn't work for full-fledged AGI. Suppose you are a young man who's 
always been taught, as top priorities, not to get yourself killed and not to 
kill people. Your country is invaded, and you are faced with the decision of 
whether to join the defense -- with a high likelihood of both.

If you have a fixed-priority utility function, you can't even THINK ABOUT the 
choice. Your pre-choice function will always say "Nope, that's bad" and 
you'll be unable to change. (This effect is intended in all the RSI -- 
recursive self-improvement -- stability arguments.) 
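The "can't even think about it" failure can be sketched as a hard filter that runs before any trade-off is weighed -- a toy model only; the action dicts and function name are hypothetical:

```python
# A fixed top priority acts as a veto BEFORE finer preferences are
# consulted: any action violating it is filtered out, so its merits
# are never even compared. The dilemma becomes invisible to the agent.
def fixed_priority_choose(actions):
    survivors = [a for a in actions if not a["risks_death"]]  # hard veto
    if not survivors:
        return None  # nothing left to choose among
    return max(survivors, key=lambda a: a["value"])  # finer preference

options = [
    {"name": "join defense", "risks_death": True,  "value": 100},
    {"name": "stay home",    "risks_death": False, "value": 10},
]
# "join defense" is vetoed before its value of 100 is ever weighed.
assert fixed_priority_choose(options)["name"] == "stay home"
```

The point is that the filter runs first: no value assigned at the lower level can ever bring the vetoed option back into consideration, which is the intended behavior in the stability arguments and the problem for the invaded-country case.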

But people CAN make choices like this. To some extent it's the most important 
thing we do. So an AI that can't make them won't be fully human-level -- not 
a true AGI.

Josh


-------------------------------------------
agi
Archives: http://www.listbox.com/member/archive/303/=now
RSS Feed: http://www.listbox.com/member/archive/rss/303/