Mark Waser <[EMAIL PROTECTED]> wrote:

>> All rational goal-seeking agents must have a mental state of maximum
>> utility where any thought or perception would be unpleasant because it
>> would result in a different state.

>I'd love to see you attempt to prove the above statement.
 
>What if there are several states with utility equal to or very close to the 
>maximum?

Then you will be indifferent as to whether you stay in one state or move 
between them.

>What if the utility of the state decreases the longer that you are in it
>(something that is *very* true of human beings)?

If you are aware of the passage of time, then you are not staying in the same 
state.

>What if uniqueness raises the utility of any new state sufficiently that
>there will always be states that are better than the current state (since
>experiencing uniqueness normally improves fitness through learning, etc.)?

Then you are not rational, because your utility function does not define a total
order over states. If you prefer A to B, B to C, and C to A, as in the case you
described, then you can be exploited: an adversary can charge you a small fee for
each step around the cycle A -> B -> C -> A, returning you to where you started
but poorer, and can repeat this indefinitely. If you are rational and have a
finite number of states, then your preferences totally order a finite set, which
must have a maximal element, so there is at least one state for which there is no
better state. The human brain is certainly finite: with on the order of 10^15
synapses, it has at most 2^(10^15) states.
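
To make the exploit concrete, here is a minimal Python sketch of the money
pump. The states, the preference cycle, and the fee are hypothetical, chosen
only to illustrate the argument, not taken from anything in this thread:

    # Money-pump sketch: an agent with cyclic preferences (A < B, B < C,
    # C < A) pays a small fee for every "upgrade" and ends up back where
    # it started, strictly poorer, after each full cycle.

    # prefers[x] is the state the agent would pay to swap x for.
    prefers = {"A": "B", "B": "C", "C": "A"}

    state = "A"
    wealth = 10.0
    FEE = 1.0  # price the exploiter charges per swap

    for step in range(9):  # three full cycles
        # Locally, each trade looks worthwhile to the agent, because
        # every step moves it to a state it prefers to the current one.
        state = prefers[state]
        wealth -= FEE
        print(f"step {step + 1}: now in {state}, wealth = {wealth}")

    # After every third step the agent is back in state A, having paid
    # 3 * FEE for no net change.

Each individual trade is rational by the agent's own preferences, yet the
sequence strictly decreases its wealth, which is what "exploitable" means here.
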

 -- Matt Mahoney, [EMAIL PROTECTED]

