On 5/8/2022 5:25 PM, Bruce Kellett wrote:
On Mon, May 9, 2022 at 10:17 AM Brent Meeker <meekerbr...@gmail.com> wrote:

    On 5/8/2022 3:42 PM, Bruce Kellett wrote:
    On Mon, May 9, 2022 at 6:37 AM smitra <smi...@zonnet.nl> wrote:

        On 08-05-2022 05:58, Bruce Kellett wrote:

        > It is when you take the SE to imply that all possible outcomes
        > exist on each trial. That gives all outcomes equal status.

        All outcomes can exist without these being equally likely. One
        can make models based on more branches for certain outcomes, but
        these are just models that may not be correct.


    Such models are certainly inconsistent with the SE. So if your
    concern is that the SE does not contain provision for a collapse,
    then you should doubt other theories that violate the SE. You
    can't have it both ways: you can't reject collapse models because
    they violate the SE and then embrace other models that also
    violate the SE. Either the SE is universally correct, or it is not.

        What matters is that such models can be formulated in a
        mathematically consistent way, which demonstrates that there is
        no contradiction. The physical plausibility of such models is
        another issue.


    This has been discussed. To allow for arbitrary real-number
    probabilities, the number of branches on each split must be infinite.

    I don't think that's a problem.  The number of information bits
    within a Hubble sphere is something like the area in Planck units,
    which already implies the continuum is just a convenient
    approximation.  If the area is N then something of order 1/N would
    be the smallest non-zero probability.  Also there would be a cutoff
    for the off-diagonal terms of the density matrix.  Once all the
    off-diagonal terms are zero it's like a mixed state, and one could
    say that one of the diagonal terms has "happened".
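
A minimal numerical sketch of that last step, with toy numbers (the 3x3 state, the value of N, and the cutoff are illustrative assumptions, not anything from the thread): coherences below ~1/N are set to exactly zero, and the remaining diagonal matrix reads as a classical mixture.

import numpy as np

N = 1_000_000                    # stand-in for the area in Planck units
cutoff = 1.0 / N                 # smallest resolvable probability/coherence

# A state that has already decohered almost completely: only tiny
# off-diagonal coherences remain.
rho = np.array([
    [0.50, 1e-9, 0.0 ],
    [1e-9, 0.30, 2e-8],
    [0.0,  2e-8, 0.20],
])

# Apply the cutoff: anything smaller than 1/N is treated as exactly zero.
rho_cut = np.where(np.abs(rho) < cutoff, 0.0, rho)

off_diag = rho_cut - np.diag(np.diag(rho_cut))
if not off_diag.any():
    # A diagonal density matrix is a classical mixture; each diagonal
    # entry is then the probability that that outcome "happened".
    print("mixed state, outcome probabilities:", np.diag(rho_cut))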


As I have pointed out before, a finite number of branches does not work: after some finite number of splits, one runs out of branches to partition in proportion to the required probabilities. One cannot simply add more branches at that stage without rendering the whole concept meaningless. Keeping things finite has its attractions, but it does not work in this case.
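
To put a toy number on that (an illustration with assumed values, not Bruce's own figures): suppose there are N branches in total and every split is supposed to partition them in proportion p : (1-p). Following the less likely outcome each time, the integer branch counts give out after only about log(N)/log(1/p) splits.

N = 10**6          # total branches available (assumed)
p = 0.3            # Born weight of the less likely outcome at each split

branches = N
splits = 0
while True:
    smaller = int(round(branches * p))
    if smaller < 1:
        break      # no integer partition can reflect the ratio any more
    branches = smaller
    splits += 1

print(f"smaller compartment exhausted after {splits} splits")
# With these values that is about a dozen splits -- and well before that
# the rounding already distorts the ratio (2 branches cannot be split 3:7).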

I think it depends on how you count splits.  If the number of degrees of freedom within a Hubble volume is finite, then the number of splits doesn't grow exponentially; they get cut off when the associated probability becomes too small.
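
A sketch of that reading of the cutoff (assumed rule and numbers, just to illustrate): a branch only splits further if both daughter weights stay above ~1/N; otherwise it carries on unsplit. The branch count then stays bounded by roughly N rather than growing like 2^k.

N = 10**4
cutoff = 1.0 / N                      # smallest resolvable probability
p = 0.3                               # Born weight of one outcome per split

branches = [1.0]                      # one branch of weight 1 to start
for step in range(100):               # 100 attempted binary splits
    new = []
    for w in branches:
        lo, hi = w * p, w * (1 - p)
        if lo >= cutoff:              # both daughters resolvable: split
            new.extend((lo, hi))
        else:                         # too fine-grained: no further split
            new.append(w)
    branches = new

print(len(branches), "branches after 100 steps; the cap is about", N)
print("total weight is still", round(sum(branches), 6))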

Brent
