On 2/28/2022 7:14 PM, Jesse Mazer wrote:

> On Mon, Feb 28, 2022 at 7:39 PM Brent Meeker <meekerbr...@gmail.com> wrote:
>> On 2/28/2022 3:39 PM, Jesse Mazer wrote:
>>> On Mon, Feb 28, 2022 at 6:12 PM Brent Meeker <meekerbr...@gmail.com> wrote:
>>>> On 2/28/2022 1:12 PM, Jesse Mazer wrote:
>>>>> Superdeterminism goes well beyond Laplacean determinism.
>>>>> Determinism is just about the dynamical laws--if you know some
>>>>> "initial" state of the universe at time T1, it says you can
>>>>> perfectly predict the state at a later time T2 (or an earlier
>>>>> time, in a time-symmetric theory). Superdeterminism is a
>>>>> constraint on the initial conditions which is meant to rule out
>>>>> some broad class of possible worlds that are *not* ruled out by
>>>>> the dynamical laws.
>>>>
>>>> In a deterministic system any given initial condition rules out
>>>> infinitely many futures.
>>>
>>> Yes, the conditional probability P(later conditions B | initial
>>> conditions A) is 1 for a unique value of B, 0 for every other
>>> possible value of B. But the dynamical laws themselves don't tell
>>> you anything about the non-conditional probability P(initial
>>> conditions A) for different possible choices of A. Superdeterminism
>>> adds an extra constraint which says P(initial conditions A) is 0
>>> for the vast majority of possible initial conditions in the phase
>>> space, and only nonzero for a tiny fraction with some very special
>>> characteristics.
>>
>> But if the universe is deterministic it had only *one* initial
>> condition...so of course it had special characteristics. Just as the
>> winning lottery ticket had a special number on it.
>
> But if you don't know that initial condition, then absent knowledge
> of some lawlike constraint on initial conditions, I think it makes
> sense to treat all initial microstates consistent with the historical
> data you've seen so far as equally likely in terms of the subjective
> probability you assign to them (this sort of assumption is needed in
> classical statistical mechanics, where to make probabilistic
> predictions about an isolated system, you generally start with the
> assumption that all microstates consistent with your knowledge of the
> macrostate are equally likely). So even if Bell inequalities have
> been consistently violated in the past, if you believe that's just a
> consequence of a particular "lucky" set of initial conditions and not
> the dynamical laws or a lawlike constraint on initial conditions,
> then if you believe the dynamical laws are local ones you should
> expect the pattern to break down in the future, since there are many
> more possible initial microstates consistent with the experimental
> results you've seen so far in which the pattern of Bell inequality
> violations would break down and the inequalities would subsequently
> be respected.

I agree. And if that happens I guess it will be (weak) support for
superdeterminism.
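Jesse's uniform-prior argument above can be made concrete with a toy
counting exercise (an illustrative Python sketch, not a physical model
-- the binary "trials" merely stand in for hypothetical future Bell
experiments under a uniform prior over all continuations of the record):

```python
from itertools import product

# Toy illustration: suppose 10 past trials all showed the pattern
# (1 = Bell inequality violated), and we treat every possible
# continuation of the record as equally likely -- the analogue of a
# uniform prior over microstates consistent with past data.
m = 5  # number of future trials considered
continuations = list(product([0, 1], repeat=m))

# Fraction of equally-likely continuations in which the streak persists:
streak = sum(1 for c in continuations if all(bit == 1 for bit in c))
print(streak / len(continuations))  # 1/32 = 0.03125

# Expected fraction of "violation" outcomes among future trials:
mean_future = sum(sum(c) for c in continuations) / (m * len(continuations))
print(mean_future)  # 0.5 -- the past streak gives no boost at all
```

Under such a prior, the overwhelming majority of continuations break the
streak, which is the sense in which one "should expect the pattern to
break down" absent a lawlike constraint.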

>>>>> In quantum theory, superdeterminism is invoked to allow for the
>>>>> possibility that the dynamical laws are local realist ones (of a
>>>>> single-world kind), so that under "generic" initial conditions
>>>>> one would expect statistically to see Bell inequalities respected
>>>>> (in contradiction to quantum predictions), but superdeterminism
>>>>> constrains the initial conditions to a special set.
>>>>
>>>> Then postulating that the initial conditions were in this set
>>>> seems like just another dynamical law; like Born's rule.
>>>
>>> Can you elaborate on the analogy to Born's rule? Born's rule is not
>>> a constraint on initial states.
>>
>> Born's rule for measurement results is not a dynamical law either.
>
> I would say that in the Copenhagen interpretation the experimenter's
> choice about what to measure is not determined by dynamical laws, but
> once the state of the detector is set, the interaction between the
> detector and the quantum system being measured does obey a dynamical
> law, one that says the system's wavefunction will collapse onto one
> of the eigenstates of whatever variable the detector is set to
> measure (the projection postulate) with probability determined by the
> square of the prior amplitude on that eigenstate (Born's rule).
>
> In any case, if you don't consider Born's rule to be any sort of true
> dynamical law, were you saying it "seems like" a dynamical law in
> some sense, and that the constraint on initial conditions "seems
> like" a dynamical law in the same sense?

I'm pointing out it could be imitated by superdeterminism even though
it's used as a law in QM. It's analogous to computer programs; there's
really no sharp distinction between program and data.
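For concreteness, the Born-rule calculation Jesse describes is just a
squared amplitude. A minimal sketch for a spin-1/2 particle prepared
spin-up and then measured along an axis tilted by an angle theta, using
the standard overlap cos^2(theta/2) (illustrative only):

```python
import math

def born_up_probability(theta):
    """Probability of 'spin up' along an axis tilted by theta from the
    preparation axis, for a spin-1/2 particle prepared spin-up.
    Born's rule: P = |<up_theta|up_z>|^2 = cos^2(theta/2)."""
    amplitude = math.cos(theta / 2)  # overlap <up_theta|up_z>
    return amplitude ** 2            # squared amplitude (Born's rule)

print(born_up_probability(0.0))                    # 1.0: same axis, certain
print(born_up_probability(math.pi))                # ~0: opposite axis
print(round(born_up_probability(math.pi / 2), 3))  # 0.5: orthogonal axis
```

The projection postulate then says the post-measurement state is the
eigenstate corresponding to whichever outcome occurred.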

>>>>> Even if we accept in principle the idea of laws that consist of
>>>>> constraints on allowable initial conditions, there is also the
>>>>> argument that the mathematical formulation of such a constraint
>>>>> would have to be incredibly complex in an algorithmic sense,
>>>>
>>>> Why? "No hidden variable" isn't very complex.
>>>
>>> Are you interpreting superdeterminist theories as ones where there
>>> are no hidden variables?

No, I'm saying no hidden variables rules out superdeterminism. Either
there are hidden variables that are sensitive to the polarization
settings, or the polarization settings are influenced by the hidden
variables. But if there are no hidden variables...no superdeterminism.

> Unless superdeterminism is assumed to measurably depart from the
> predictions of QM, it does require hidden variables--the idea in a
> Bell test measurement involving spin measurements, for example, is
> that the particle pair have hidden variables which predetermine what
> spins they will have along the axes the experimenters will later
> choose to measure.
>
>>>>> that it would have to have some built-in "concept" of high-level
>>>>> observers and measuring instruments so that the hidden variables
>>>>> could be assigned to particle pairs in a way that anticipated the
>>>>> fact that the two particles would later be measured by
>>>>> instruments in a certain configuration (the orientation of the
>>>>> Stern-Gerlach devices used to measure each particle's spins, for
>>>>> example).
>>>>
>>>> But in a deterministic system all those things have a common
>>>> cause; their past light cones overlap.
>
> The event of the particle pair being emitted from the source is in
> the past light cone of each of the measurements, but each
> experimenter could for example base their decision about what axis to
> measure on a pseudorandom algorithm that took as its seed some
> astronomical data from a region that's outside the past light cone of
> the pair emission,

That's three past light cones that must not overlap to rule out a
common cause violation of statistical independence. That means they
need to be more than 14 billion light years apart.

Brent

> and also outside the past light cone of the other experimenter making
> their own decision. And the hidden variables assigned to the
> particles when they're emitted have to act as though they
> "anticipate" what measurements will be performed on them, even if the
> choice of measurements depended on information outside the past light
> cone of the emission event.
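The role the hidden variables play here can be sketched with a toy
local model (purely illustrative, not a model anyone in the thread
proposed): each pair carries one shared hidden angle fixed at emission,
each detector's +/-1 outcome depends only on its own setting and that
angle, and the CHSH combination of correlations then stays within the
local bound of 2, whereas the quantum singlet prediction reaches
2*sqrt(2):

```python
import math
import random

def A(setting, lam):
    # Deterministic local response: outcome +/-1 from the sign of
    # cos(setting - lam); depends only on the local setting and lam.
    return 1 if math.cos(setting - lam) >= 0 else -1

def E(a, b, n, rng):
    # Monte Carlo estimate of the correlation <A(a) * B(b)>, with the
    # second particle perfectly anti-correlated (B = -A).
    total = 0
    for _ in range(n):
        lam = rng.uniform(0.0, 2.0 * math.pi)  # shared hidden variable
        total += A(a, lam) * (-A(b, lam))
    return total / n

rng = random.Random(42)
a1, a2 = 0.0, math.pi / 2              # Alice's two settings
b1, b2 = math.pi / 4, 3 * math.pi / 4  # Bob's two settings
n = 100_000

S = E(a1, b1, n, rng) - E(a1, b2, n, rng) + E(a2, b1, n, rng) + E(a2, b2, n, rng)
print(abs(S))  # stays at or below 2 (up to sampling noise): the CHSH bound

# Quantum singlet prediction E(a, b) = -cos(a - b) for the same settings:
S_qm = -math.cos(a1 - b1) + math.cos(a1 - b2) - math.cos(a2 - b1) - math.cos(a2 - b2)
print(abs(S_qm))  # 2*sqrt(2) ~ 2.83: exceeds the local bound
```

The point of a superdeterministic "conspiracy" is precisely that the
hidden angle and the two settings would have to be correlated at
emission, so that the sampling above is no longer independent of the
choice of (a, b).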

-- You received this message because you are subscribed to the Google Groups "Everything List" group. To unsubscribe from this group and stop receiving emails from it, send an email to everything-list+unsubscr...@googlegroups.com. To view this discussion on the web visit https://groups.google.com/d/msgid/everything-list/807c943d-c056-db35-655f-5f9bff4234f7%40gmail.com.