--- On Sun, 6/22/08, Kaj Sotala <[EMAIL PROTECTED]> wrote:

> On 6/21/08, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> >
> > Eliezer asked a similar question on SL4. If an agent
> flips a fair quantum coin and is copied 10 times if it
> comes up heads, what should be the agent's subjective
> probability that the coin will come up heads? By the
> anthropic principle, it should be 0.9. That is because if
> you repeat the experiment many times and you randomly
> sample one of the resulting agents, it is highly likely
> that will have seen heads about 90% of the time.
> 
> That's the wrong answer, though (as I believe I pointed out when the
> question was asked over on SL4). The copying is just a red
> herring, it doesn't affect the probability at all.
> 
> Since this question seems to confuse many people, I wrote a
> short Python program simulating it:
> http://www.saunalahti.fi/~tspro1/Random/copies.py

The question was about subjective anticipation, not the actual outcome. The 
answer depends on how the agent is programmed. If you extend your experiment so 
that agents perform repeated, independent trials and remember the results, you 
will find that a randomly sampled agent will, on average, remember the coin 
coming up heads about 90% of the time (10/11, to be exact, with tenfold 
copying). The agents have to reconcile this evidence with their knowledge that 
the coin is fair.
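To make the "about 90%" figure concrete, here is a minimal Python sketch (my own illustration, not Kaj's copies.py): it repeats the experiment many times, replaces every agent that sees heads with 10 copies, pools all resulting agents across runs, and asks what fraction of the pooled memories are heads. The pooled fraction converges to 10/11 ≈ 0.91.

```python
import random

def pooled_heads_fraction(runs=2000, trials=3, copies=10, seed=1):
    """Repeat the experiment `runs` times. In each run, every agent flips a
    fair coin `trials` times; on heads the agent is replaced by `copies`
    copies (all remembering heads), on tails it continues alone. Return the
    fraction of heads among the memories of all pooled resulting agents."""
    rng = random.Random(seed)
    heads_memories = 0
    total_memories = 0
    for _ in range(runs):
        # Each agent is summarized by how many heads it remembers so far.
        agents = [0]
        for _ in range(trials):
            nxt = []
            for h in agents:
                if rng.random() < 0.5:   # heads: agent becomes `copies` copies
                    nxt += [h + 1] * copies
                else:                    # tails: agent continues alone
                    nxt.append(h)
            agents = nxt
        heads_memories += sum(agents)
        total_memories += trials * len(agents)
    return heads_memories / total_memories

print(pooled_heads_fraction())   # close to 10/11 ~ 0.909
```

Note the pooling step: averaging the heads fraction within each run and then across runs would give 0.5, because that weights every run equally. Sampling uniformly from the pooled agents weights runs by how many agents they produce, which is exactly where the anthropic shift comes from.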

It is a trickier question without multiple trials. The agent then needs to 
model its own thought process (which is impossible for any Turing-computable 
agent to do with 100% accuracy). If the agent knows that it is programmed so 
that, after observing an outcome R times out of N trials, it expects that 
outcome with probability R/N, then it would conclude: "I know that I would 
observe heads about 90% of the time, and therefore I expect heads with 
probability 0.9". But this programming would not make sense in a scenario with 
conditional copying.

Here is an equivalent question. If you flip a fair quantum coin, and you are 
killed with 90% probability conditional on the coin coming up tails (killing 9 
of 10 tails-observers is the mirror image of making 10 copies of each 
heads-observer), then, when you look at the coin, what is your subjective 
anticipation of seeing "heads"?
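On one common reading of that question (your anticipation should match what a surviving observer would see), it reduces to straight Bayes, sketched below. The function name and the tails-kill parameter are my own illustration; whether this reflective reading is the right notion of subjective anticipation is, of course, the point under dispute.

```python
def p_heads_given_survival(p_kill_given_tails):
    """Bayes: you always survive heads; you survive tails with probability
    1 - p_kill_given_tails. Among survivors, what fraction saw heads?"""
    p_heads = 0.5
    p_survive_tails = 0.5 * (1.0 - p_kill_given_tails)
    return p_heads / (p_heads + p_survive_tails)

# Killing 9 of 10 tails-observers mirrors making 10 copies of each
# heads-observer: both weight heads-histories 10:1.
print(p_heads_given_survival(0.9))   # 10/11 ~ 0.909
```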


-- Matt Mahoney, [EMAIL PROTECTED]



-------------------------------------------
agi
Archives: http://www.listbox.com/member/archive/303/=now
RSS Feed: http://www.listbox.com/member/archive/rss/303/