On 27-04-2022 01:37, Bruce Kellett wrote:
On Tue, Apr 26, 2022 at 10:03 AM smitra <smi...@zonnet.nl> wrote:

On 24-04-2022 03:16, Bruce Kellett wrote:

A moment's thought should make it clear to you that this is not possible. If both possibilities are realized, it cannot be the case that one has twice the probability of the other. In the long run, if both are realized they have equal probabilities of 1/2.

The probabilities do not have to be 1/2. Suppose one million people participate in a lottery such that there will be exactly one winner. The probability that one given person will win is then one in a million. Suppose now that we create one million people using a machine and then organize such a lottery. The probability that one given newly created person will win is then also one in a million. The machine can be adjusted to create any set of persons we like: it can create one million identical persons, or almost identical persons, or totally different persons. If we then create one million almost identical persons, the probability is still one in a million. This means that in the limit of identical persons, the probability will be one in a million.

Why would the probability suddenly become 1/2 if the machine is set to create exactly identical persons, while the probability would be one in a million if we create persons that are almost, but not quite, identical?

Your lottery example is completely beside the point.

It provides an example of a case where your logic does not apply.
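The lottery point can be checked numerically. Below is a minimal sketch (the function name and parameters are mine, purely illustrative): the draw depends only on the number of participants and never on their identities, so making the persons more and more alike cannot move one person's winning probability from 1/N toward 1/2.

```python
import random

def lottery_win_probability(n_persons, n_trials, person=0, seed=1):
    """Estimate, by repeated draws, the chance that one given person
    wins a lottery that has exactly one winner among n_persons.
    The persons' identities never enter the draw, so the estimate is
    the same whether the persons are all different, almost identical,
    or exactly identical."""
    rng = random.Random(seed)
    wins = sum(1 for _ in range(n_trials)
               if rng.randrange(n_persons) == person)
    return wins / n_trials

# With 1000 participants the estimate stays close to 1/1000 = 0.001.
print(lottery_win_probability(1000, 100_000))
```

Scaling n_persons up to one million changes nothing in the logic; only the runtime grows.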

I think you
should pay more attention to the mathematics of the binomial
distribution. Let me explain it once more: If every outcome is
realized on every trial of a binary process, then after the first
trial, we have a branch with result 0 and a branch with result 1.
trial, we have a branch with result 0 and a branch with result 1.
After two trials we have four branches, with results 00, 01, 10, and
11; after 3 trials, we have branches registering 000, 001, 011, 010,
100, 101, 110, and 111. Notice that these branches represent all
possible binary strings of length 3.

After N trials, there are 2^N distinct branches, representing all
possible binary sequences of length N. (This is just like Pascal's
triangle.) As N becomes very large, we can approximate the binomial
distribution with the normal distribution, with mean 0.5 and standard
deviation that decreases as 1/sqrt(N). In other words, the majority of
trials will have equal, or approximately equal, numbers of 0s and 1s.
Observers in these branches will naturally take the probability to be
approximated by the relative frequencies of 0s and 1s. In other words,
they will take the probability of each outcome to be 0.5.
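The counting argument quoted above is easy to verify, under its explicit assumption that every branch counts equally (the function name below is mine, purely illustrative). Using binomial coefficients, one can compute the fraction of the 2^N equally-counted branches whose relative frequency of 1s lies close to 1/2:

```python
from math import comb, sqrt

def fraction_of_branches_near_half(N, n_sds=3):
    """Assume uniform branch counting: each of the 2**N binary
    outcome strings of length N counts exactly once.  Return the
    fraction of strings whose number of 1s lies within n_sds
    standard deviations (sd = sqrt(N)/2) of the mean N/2."""
    sd = sqrt(N) / 2
    lo, hi = N / 2 - n_sds * sd, N / 2 + n_sds * sd
    near = sum(comb(N, k) for k in range(N + 1) if lo <= k <= hi)
    return near / 2**N

# For N = 1000, well over 99% of the equally-counted branches record
# roughly as many 0s as 1s -- independently of the amplitudes a, b.
print(fraction_of_branches_near_half(1000))
```

Note that the amplitudes a and b appear nowhere in this computation, which is exactly the issue under dispute below.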


The problem with this is that you simply assume that all branches are equally probable. You don't make that assumption explicit, but it is an assumption nonetheless: you are doing plain branch counting.


The important point to notice is that this result of all possible
binary sequences for N trials is independent of the coefficients in
the binary expansion of the state:

      |psi> = a|0> + b|1>.

Changing the weights of the components in the superposition does not
change the conclusion of most observers that the actual probabilities
are 0.5 for each result. This is simple mathematics, and I am amazed
that even after all these years, and all the times I have spelled this
out, you still seek to deny the obvious result. Your logical and
mathematical skill are on a par with those of John Clark.


It's indeed simple mathematics. You apply it to branch counting and arrive at equal probabilities, so the conclusion has to be that one should not do branch counting. The question is then whether this disproves the MWI. If by MWI we mean QM minus collapse, then clearly not: in that case we use the Born rule to compute the probabilities of the outcomes, and assume that after a measurement there are different sectors for observers who have observed the different outcomes, with the probabilities given by the Born rule.

You then want to argue against that by claiming that your argument applies generally and would not allow one to assign different sectors unequal probabilities. But that's nonsense, because you make the hidden assumption of equal probabilities right from the start. There is nothing in QM that says branches must have equal weight, and the lottery example I gave makes it clear that you can have branching with unequal probabilities even in classical physics.
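The alternative described here (QM minus collapse, with the Born rule supplying the sector weights) can be sketched in the same notation. Assuming |psi> = a|0> + b|1> and writing p1 = |b|^2, a branch containing k ones out of N trials gets Born weight (1 - p1)^(N - k) * p1^k instead of the uniform weight 1/2^N; the weighted relative frequency of 1s then comes out as p1, not 1/2 (function name mine, purely illustrative):

```python
from math import comb

def born_weighted_fraction_of_ones(N, p1):
    """For |psi> = a|0> + b|1> with p1 = |b|**2, weight each
    length-N branch containing k ones by its Born weight
    (1 - p1)**(N - k) * p1**k, and return the weighted mean of the
    relative frequency k/N over all comb(N, k) such branches."""
    return sum(comb(N, k) * (1 - p1)**(N - k) * p1**k * (k / N)
               for k in range(N + 1))

# Unequal branch weights give unequal outcome frequencies: for
# p1 = 0.25 the weighted frequency of 1s is 0.25, not 0.5.
print(born_weighted_fraction_of_ones(100, 0.25))
```

This mirrors the classical lottery: branching events can carry any weights, and nothing forces those weights to be equal.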

Saibal



Bruce

 --
You received this message because you are subscribed to the Google
Groups "Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send
an email to everything-list+unsubscr...@googlegroups.com.
To view this discussion on the web visit
https://groups.google.com/d/msgid/everything-list/CAFxXSLT22raNMhxUsrUqHni2P-T4Ww%3DXQh_HKUO7CBpTZv8q_Q%40mail.gmail.com

To view this discussion on the web visit 
https://groups.google.com/d/msgid/everything-list/352f2df7fc14c6bfa795f68e8b9b6e2e%40zonnet.nl.
