On 4/26/2022 5:32 PM, smitra wrote:
On 27-04-2022 01:37, Bruce Kellett wrote:
On Tue, Apr 26, 2022 at 10:03 AM smitra <smi...@zonnet.nl> wrote:
On 24-04-2022 03:16, Bruce Kellett wrote:
A moment's thought should make it clear to you that this is not
possible. If both possibilities are realized, it cannot be the case
that one has twice the probability of the other. In the long run, if
both are realized, they have equal probabilities of 1/2.
The probabilities do not have to be 1/2. Suppose one million people
participate in a lottery such that there will be exactly one winner. The
probability that one given person will win is then one in a million.
Suppose now that we create one million people using a machine and
organize such a lottery. The probability that one given newly created
person will win is then also one in a million. The machine can be
adjusted to create any set of persons we like: it can create one million
identical persons, or almost identical persons, or totally different
persons. If we then create one million almost identical persons, the
probability is still one in a million. This means that in the limit of
exactly identical persons, the probability will be one in a million.
Why would the probability suddenly become 1/2 if the machine is set to
create exactly identical persons, while the probability would be one in a
million if we create persons that are almost, but not quite, identical?
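As a quick sketch of the lottery argument (my own illustration, not part of the original message; N is scaled down from one million so the simulation converges quickly):

```python
import random

# Draw one winner uniformly among N participants. The chance that one
# *given* participant wins is 1/N, whether the participants are
# identical or not: the draw never looks at who the participants are.
N = 1_000
TRIALS = 100_000

wins = sum(1 for _ in range(TRIALS) if random.randrange(N) == 0)
freq = wins / TRIALS
print(f"empirical: {freq:.4f}, expected: {1 / N:.4f}")
```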
Your lottery example is completely beside the point.
It provides an example of a case where your logic does not apply.
I think you
should pay more attention to the mathematics of the binomial
distribution. Let me explain it once more: if every outcome is
realized on every trial of a binary process, then after the first
trial, we have a branch with result 0 and a branch with result 1.
After two trials we have four branches, with results 00, 01, 10, and
11; after 3 trials, we have branches registering 000, 001, 010, 011,
100, 101, 110, and 111. Notice that these branches represent all
possible binary strings of length 3.
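This enumeration can be sketched in a few lines (illustration only):

```python
from itertools import product

# Enumerate the branches after n binary trials: each branch records one
# binary string, and every string of length n occurs exactly once.
for n in (1, 2, 3):
    branches = ["".join(bits) for bits in product("01", repeat=n)]
    print(n, len(branches), branches)
```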
After N trials, there are 2^N distinct branches, representing all
possible binary sequences of length N. (This is just like Pascal's
triangle.) As N becomes very large, we can approximate the binomial
distribution with the normal distribution, with mean 0.5 and standard
deviation that decreases as 1/sqrt(N). In other words, the vast
majority of branches will have equal, or approximately equal, numbers
of 0s and 1s. Observers in these branches will naturally take the
probability to be approximated by the relative frequencies of 0s and
1s. In other words, they will take the probability of each outcome to
be 0.5.
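The concentration claim can be checked directly (my sketch, counting each branch once, i.e. equal weight):

```python
from math import comb, sqrt

# Fraction of the 2^N branches whose number of 1s lies within sqrt(N)
# of N/2, i.e. within two standard deviations of the mean (the binomial
# sd is sqrt(N)/2), when every branch is counted equally.
for N in (10, 100, 1000):
    near = sum(comb(N, k) for k in range(N + 1)
               if abs(k - N / 2) <= sqrt(N))
    frac = near / 2 ** N
    print(N, frac)
```

The fraction stays near 0.95 (the two-sigma mass) while the window of relative frequencies, N/2 ± sqrt(N) out of N, shrinks as 1/sqrt(N).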
The problem with this is that you just assume that all branches are
equally probable. You don't make that explicit; it's implicitly
assumed, but it's just an assumption. You are simply doing branch
counting.
But it shows why you can't use branch counting. There's no
mechanism for translating the _a_ and _b_ of _|psi> = a|0> + b|1>_
into numbers of branches. To implement that you have to put it in "by
hand" that the branches have weights or numerosity of _a_ and _b_.
This is possible, but it gives the lie to the MWI mantra of "It's just
the Schroedinger equation."
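The gap between branch counting and amplitude weights can be made concrete with a small sketch (my own illustration, not from the thread): averaging the relative frequency of outcome 0 over all branches counted equally always gives 0.5, whereas weighting each branch by its squared amplitude gives |a|^2, whatever a and b are.

```python
from itertools import product

# For |psi> = a|0> + b|1> with real amplitudes, compare the relative
# frequency of outcome 0 averaged over branches counted equally
# (branch counting) with the Born-weighted average.
a, b = 0.6, 0.8                  # example amplitudes, a**2 + b**2 == 1
N = 10

count_avg = born_avg = 0.0
for bits in product((0, 1), repeat=N):
    n0 = bits.count(0)
    f0 = n0 / N                  # relative frequency of outcome 0
    weight = (a ** 2) ** n0 * (b ** 2) ** (N - n0)
    count_avg += f0 / 2 ** N     # every branch weighted equally
    born_avg += f0 * weight      # branches weighted by squared amplitude
print(count_avg, born_avg)       # count_avg ~ 0.5, born_avg ~ a**2
```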