Re: The Nature of Contingency: Quantum Physics as Modal Realism

On Tue, Apr 26, 2022 at 10:03 AM smitra <smi...@zonnet.nl> wrote:

> On 24-04-2022 03:16, Bruce Kellett wrote:
>
> > A moment's thought should make it clear to you that this is not
> > possible. If both possibilities are realized, it cannot be the case
> > that one has twice the probability of the other. In the long run, if
> > both are realized they have equal probabilities of 1/2.
>
> The probabilities do not have to be 1/2.  Suppose one million people
> participate in a lottery such that there will be exactly one winner. The
> probability that one given person will win, is then one in a million.
> Suppose now that we create one million people using a machine and then
> organize such a lottery. The probability that one given newly created
> person will win is then also one in a million. The machine can be
> adjusted to create any set of persons we like, it can create one million
> identical persons, or almost identical persons, or totally different
> persons. If we then create one million almost identical persons, the
> probability is still one in a million. This means that in the limit of
> identical persons, the probability will still be one in a million.
>
> Why would the probability suddenly become 1/2 if the machine is set to
> create exactly identical persons while the probability would be one in a
> million if we create persons that are almost, but not quite identical?
>
Your lottery example is completely beside the point. I think you should pay
more attention to the mathematics of the binomial distribution. Let me
explain it once more: If every outcome is realized on every trial of a
binary process, then after the first trial, we have a branch with result 0
and a branch with result 1. After two trials we have four branches, with
results 00, 01, 10, and 11; after three trials, we have eight branches,
registering 000, 001, 010, 011, 100, 101, 110, and 111. Notice that these
branches represent all possible binary strings of length 3.
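The branch-doubling described above can be sketched in a few lines of Python (my illustration, not part of the original exchange): enumerating every binary string of length N shows that all 2^N sequences appear, and the construction never consults any amplitudes or weights.

```python
from itertools import product

def branches(n):
    """All branches after n binary trials when both outcomes are
    realized on every trial: every binary string of length n.
    Note that the construction never references any amplitudes."""
    return [''.join(bits) for bits in product('01', repeat=n)]

print(branches(3))
# the 8 binary strings of length 3: '000', '001', ..., '111'
```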

After N trials, there are 2^N distinct branches, representing all possible
binary sequences of length N. (The number of branches containing exactly k
1s is the binomial coefficient C(N, k), as in Pascal's triangle.) As N
becomes very large, the binomial distribution of the relative frequency of
1s across branches is well approximated by a normal distribution with mean
0.5 and a standard deviation that decreases as 1/sqrt(N). In other words,
the vast majority of branches will have equal, or approximately equal,
numbers of 0s and 1s. Observers in these branches will naturally take the
probability to be approximated by the relative frequencies of 0s and 1s;
in other words, they will take the probability of each outcome to be 0.5.
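The concentration around relative frequency 0.5 can be checked directly with binomial coefficients. This sketch (mine, with an arbitrary tolerance of 0.05) counts the fraction of the 2^N branches whose proportion of 1s lies close to one half:

```python
from math import comb

def fraction_near_half(n, eps=0.05):
    """Fraction of the 2**n branches whose relative frequency of 1s
    lies within eps of 0.5.  Exactly C(n, k) branches contain k 1s."""
    near = sum(comb(n, k) for k in range(n + 1) if abs(k / n - 0.5) <= eps)
    return near / 2 ** n

for n in (10, 100, 1000):
    print(n, fraction_near_half(n))
```

As N grows, the printed fraction approaches 1, matching the 1/sqrt(N) shrinkage of the standard deviation.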

The important point to notice is that this result (all possible binary
sequences occurring after N trials) is independent of the coefficients in
the expansion of the state:

|psi> = a|0> + b|1>.

Changing the weights of the components in the superposition does not change
the conclusion of most observers that the actual probabilities are 0.5 for
each result. This is simple mathematics, and I am amazed that even after
all these years, and all the times I have spelled this out, you still seek
to deny the obvious result. Your logical and mathematical skills are on a
par with those of John Clark.

Bruce

--
You received this message because you are subscribed to the Google Groups
"Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an email