On 19-05-2020 03:51, Bruce Kellett wrote:
On Tue, May 19, 2020 at 11:16 AM 'Brent Meeker' via Everything List
<everything-list@googlegroups.com> wrote:

On 5/18/2020 4:28 PM, Bruce Kellett wrote:

I am sorry if I have given the impression that I thought that
objective probabilities were possible only with frequentism. I
thought I had made it clear that frequentism fails as a basis for
the meaning of probability. There are many places where this is
argued, and the consensus is that long-run relative frequencies
cannot be used as a  definition of probability.

I was appealing to the propensity interpretation, which says that
probabilities are intrinsic properties of some things, such as
decay rates; i.e., that probability is an intrinsic property of
radioactive nuclei. But I agree with Brent: probabilities can be
taken to be anything that satisfies the basic axioms of
probability theory -- such as non-negative, normalisable, and
additive. So subjective degrees of belief can form the basis for
probabilities, as can certain symmetry properties, relative
frequencies, and so on.
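As a minimal sketch of that last point, any assignment satisfying the three axioms qualifies as a probability, whatever its interpretation. The outcome weights below are purely illustrative (decay vs. no decay in some fixed interval), not real nuclear data:

```python
# Minimal check of the basic probability axioms for a two-outcome
# assignment. The weights are illustrative, not measured values.

weights = {"decay": 0.3, "no_decay": 0.7}

# Non-negativity: every outcome gets a weight >= 0.
assert all(w >= 0 for w in weights.values())

# Normalisability: the weights sum to 1.
assert abs(sum(weights.values()) - 1.0) < 1e-12

# Additivity: for disjoint events, P(A or B) = P(A) + P(B).
p_either = weights["decay"] + weights["no_decay"]
assert abs(p_either - 1.0) < 1e-12

print("axioms satisfied")
```

Nothing in these checks says whether the weights are propensities, degrees of belief, or symmetry counts; the axioms are interpretation-neutral.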

The point is that while these things can be understood as
probabilities in ordinary usage, they don't actually define what
probability is. One can use frequency counts to estimate many of
these probabilities, and one can use Bayes's theorem to update
estimates of probability based on new evidence. But Bayes's
theorem is merely an updating method -- it is not a definition of
probability. People who consider themselves to be Bayesians
usually have a basically subjective idea about probability,
considering it essentially quantifies personal degrees of belief.
But that understanding is not inherent in Bayes's theorem itself.
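To illustrate Bayes's theorem as a pure updating rule (all numbers below are made up for the example), note that the calculation works identically whether the inputs are read as degrees of belief or as objective chances:

```python
# Bayes's theorem as an updating rule: posterior = P(E|H) * P(H) / P(E).
# Illustrative numbers only; nothing here defines what probability *is*.

prior = 0.5            # initial probability assigned to hypothesis H
p_e_given_h = 0.8      # likelihood of the evidence E if H is true
p_e_given_not_h = 0.2  # likelihood of E if H is false

# Total probability of the evidence, then the Bayes update:
p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
posterior = p_e_given_h * prior / p_e

print(posterior)  # 0.8
```

The theorem takes probabilities in and puts revised probabilities out; the question of what those numbers mean is settled elsewhere.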

As Brent says, these different approaches to probability have
their uses in everyday life, but most of them are not suitable for
fundamental physics. I consider objective probabilities based on
intrinsic properties, or propensities, to be essential for a
proper understanding of radio-active decay, and the probability of
getting spin-up on a spin measurement, and so on. These things are
properties of the way the world is, not matters of personal
belief, or nothing more than relative frequencies. Probabilities
may well be built into the fabric of the quantum wave-function via
the amplitudes, but the probabilistic interpretation of these
amplitudes has to be imposed via the Born rule: just as with any
mathematical theory, one needs correspondence rules to say how
the mathematical elements relate to physical observables. From
that point of view, attempts to derive the Born rule from within
the theory are doomed to failure -- contrary to the many-worlders'
dream, the theory does not contain its own interpretation.

But even if you're right (and I think you are), does that affect the
MWI? In an Everett+Born theory there will still be other worlds, and
the interpretation will still avoid the question, "When and where is
a measurement?"...answer: "Whenever decoherence has made one state
orthogonal to all other states." Of course we could then ask the
question, "When and where has the wave function collapsed?" and give
the same answer. Which would be CI+Zurek.

Does this analysis affect the MWI? I think it does, because if
probabilities are intrinsic, and the Born rule is merely a rule that
connects the theory with experiment, not something that can be
derived from within the theory, then we are left with the tension that
I mentioned a while ago between many-worlds and the probability
interpretation. Everett says that every outcome happens, but separate
outcomes are in separate "relative states", or separate, orthogonal,
branches. This can only mean that the probability is given by branch
counting, and the probabilities are necessarily 50/50 for the
two-state case. Then the probability for each branch on repeated
measurements of an ensemble of N similarly prepared states is 1/2^N,
and those probabilities are independent of the amplitudes. This is
inconsistent with the analysis that gives the probability for each
branch of the repeated trials by the normal binomial probability, with
p equal to the mod-squared amplitude. (This is the analysis with which
I began this thread.) The binomial probabilities are experimentally
observed, so MWI is, on this account, inconsistent with experiment
(even if not actually incoherent).
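To make the tension concrete, here is a sketch comparing the two probability assignments for the count of (say) spin-up outcomes in N repeated measurements. The values of N and the mod-squared amplitude are illustrative:

```python
import math

N = 10   # number of repeated measurements (illustrative)
p = 0.9  # mod-squared amplitude for spin-up (illustrative)

# Branch counting: 2^N branches, each weighted 1/2^N, so the
# probability of exactly k "up" results is C(N, k) / 2^N --
# independent of the amplitudes.
def branch_counting_prob(k, n=N):
    return math.comb(n, k) / 2**n

# Born rule: the ordinary binomial distribution with parameter p.
def born_prob(k, n=N, q=p):
    return math.comb(n, k) * q**k * (1 - q)**(n - k)

# Most probable count of "up" outcomes under each assignment:
bc_mode = max(range(N + 1), key=branch_counting_prob)  # always N/2
born_mode = max(range(N + 1), key=born_prob)           # near p*N

print(bc_mode, born_mode)  # 5 9
```

Branch counting pins the modal relative frequency at 1/2 regardless of the amplitudes, while the Born-rule binomial puts it near p; for any p other than 1/2 the two assignments disagree, and it is the binomial statistics that experiments report.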

The question of the preferred basis and the nature of measurement is
answered by decoherence, and the collapse of the wave function is
another interpretative move. On this view, the wave function is
epistemic rather than ontological. The ontology is the particles or
fields that the wave function describes, and these live in ordinary
3-space. This is not necessarily CI+Zurek, because Zurek still thinks
he can derive probabilities from within the theory, and CI does not
encompass decoherence and the quantum nature of everything. Nor is it
Qbism, since that idea does not really encompass objective
probabilities.

Bruce

It's implausible that a fundamental theory can have macroscopic concepts like "environment", "experiment", etc. built into it. That's what motivated the MWI in the first place. Now, the MWI may not be exactly correct, as QM may itself be only an approximation to a more fundamental theory. But the approach of trying to address interpretational problems by relying on macroscopic concepts is a priori doomed to fail.

Saibal

