2008/6/12 J Storrs Hall, PhD <[EMAIL PROTECTED]>:
> On Thursday 12 June 2008 02:48:19 am, William Pearson wrote:
>
>> The kinds of choices I am interested in designing for at the moment
>> are whether program X or program Y should get control of this bit of
>> memory or this IRQ for the next time period. X and Y can also make
>> choices, and you would need to nail them down as well in order to get
>> the entire U(x) as you talk about it.
>>
>> As the function I am interested in is only concerned with
>> programmatic changes, call it PCU(x).
>>
>> Can you give me a reason why the utility function can't be separated
>> out this way?
>
>
> This is roughly equivalent to a function where the highest-level arbitrator
> gets to set the most significant digit, the programs X,Y the next most, and
> so forth. As long as the possibility space is partitioned at each stage, the
> whole business is rational -- doesn't contradict itself.

Modulo special cases, agreed.
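To make the digit scheme concrete, here's a toy sketch (the Python framing and all the names are mine, purely illustrative): the arbitrator's score is the most significant "digit", and each program's score only breaks ties within the arbitrator's partition, so a program can refine the ordering but never override it.

```python
# Hypothetical sketch of lexicographic ("most significant digit") utility.
# The arbitrator's preference dominates; each program only makes finer
# distinctions within the arbitrator's partition of the outcome space.

def combined_utility(outcome, arbitrator, programs):
    # Tuples compare lexicographically in Python, so the arbitrator's
    # score is the most significant "digit" and each successive program
    # refines the ordering without ever overriding it.
    return (arbitrator(outcome),) + tuple(p(outcome) for p in programs)

# Illustrative preferences: the arbitrator cares about safety,
# program X only about speed.
arbitrator = lambda o: o["safety"]
px = lambda o: o["speed"]

a = {"safety": 1, "speed": 0}
b = {"safety": 0, "speed": 9}

# a wins despite X's strong preference for b, because the arbitrator's
# digit is more significant.
best = max([a, b], key=lambda o: combined_utility(o, arbitrator, [px]))
```

As long as each level only refines the partition above it, the combined ordering can't contradict itself, which is the rationality claim above.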

> Allowing the program to play around with the less significant digits, i.e. to
> make finer distinctions, is probably pretty safe (and the way many AIers
> envision doing it). It's also reminiscent of the way Maslow's hierarchy
> works.
>
> But it doesn't work for full fledged AGI.

It is the best design I have at the moment; whether it can achieve what
you want is another matter. I'll continue to try to think of better
ones. If nothing else it should get me a useful system, and, if it
proves inadequate, hopefully more people interested in the full AGI
problem.

What path are you going to continue down?

> Suppose you are a young man who's
> always been taught not to get yourself killed, and not to kill people (as top
> priorities). You are confronted with your country being invaded and faced
> with the decision to join the defense with a high likelihood of both.

With the system I am thinking of, it can get stuck in positions that
aren't optimal, because the program-control utility function only
chooses from the programs extant in the system. It is possible for the
system to be dominated by a monopoly or cartel of programs, such that
the program chooser effectively has no choice. This would only happen
after a long period of stasis with a very powerful/useful set of
programs -- in this case, possibly patriotism or the protection of
other sentients, both being very useful during peacetime.

This does seem like something you would consider a bug, and it might
be. It is not one I can currently see a way to guard against.
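A toy sketch of how the system gets stuck (again, names and framing are mine, just to illustrate): the chooser can only maximise over the extant candidate set, so once a dominant set of programs crowds out rivals, the maximisation is over a set of one.

```python
# Hypothetical sketch of the program chooser being limited to extant
# programs. It cannot invent a better program that isn't already in
# the system, so a monopoly leaves it with no real choice.

def choose(programs, pcu):
    # Maximise PCU over whatever programs currently exist -- nothing else.
    return max(programs, key=pcu)

pcu = lambda p: p["usefulness"]

# After a long period of stasis, one powerful incumbent remains.
extant = [{"name": "patriotism", "usefulness": 9}]

winner = choose(extant, pcu)
# A better alternative outside the system is invisible to the chooser,
# so the system stays stuck at a local optimum.
```

The "bug" above is just this: the argmax is over the extant set, not over the space of possible programs.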

  Will Pearson

