On 4/7/2022 8:00 PM, smitra wrote:
On 07-04-2022 18:25, Brent Meeker wrote:
On 4/7/2022 8:25 AM, smitra wrote:
On 07-04-2022 03:05, Bruce Kellett wrote:
On Thu, Apr 7, 2022 at 10:52 AM smitra <smi...@zonnet.nl> wrote:

On 07-04-2022 02:30, Bruce Kellett wrote:

The preferred basis is not determined by algorithms -- it is
determined by robustness under decoherence. You can redefine
everything so that your theory is no longer quantum mechanics --
but
that is a fairly pointless exercise.

That's the preferred basis as used in practice. But that's useless in
this context, and it would amount to doing things backward. Observers
cannot be defined using decoherence. Rather, it is robustness under
decoherence that allows us, as stable observers, to exist. So
decoherence explains our existence.

It also explains our ability to make measurements and record results.
To claim that this is not a fundamental account of measurement is just
silly. Nothing is more fundamental than quantum entanglement, as
evidenced in decoherence.
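As a minimal sketch of the standard decoherence mechanism (notation
assumed here purely for illustration: a two-state system S coupled to
an environment E):

\[
(\alpha|0\rangle + \beta|1\rangle)\,|E_0\rangle
\;\longrightarrow\;
\alpha|0\rangle|E_0\rangle + \beta|1\rangle|E_1\rangle,
\]
\[
\rho_S = \mathrm{Tr}_E\,|\Psi\rangle\langle\Psi|
= |\alpha|^2|0\rangle\langle 0| + |\beta|^2|1\rangle\langle 1|
+ \alpha\beta^*\,\langle E_1|E_0\rangle\,|0\rangle\langle 1| + \mathrm{h.c.}
\]

As the environment states become distinguishable, \(\langle E_1|E_0\rangle \to 0\),
the off-diagonal terms are suppressed, and \(\rho_S\) becomes diagonal in
\(\{|0\rangle, |1\rangle\}\), the basis that is robust under decoherence.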


While entanglement is a phenomenon that exists at the fundamental level, effective macroscopic concepts can never be fundamental; they have to be explained using the fundamental microscopic theory, in which many macroscopic concepts do not even exist.

You need to keep in mind that there are different meanings of
"fundamental".  Those "macroscopic concepts" like measurements and
records and facts are epistemically fundamental, and they remain so however
theories change.  The reductionist base of the current theory is
ontologically fundamental, but it may be replaced by a new theory with
a different ontology, as QM replaced Newtonian mechanics and
statistical mechanics replaced thermodynamics.  Being ontologically
fundamental is a precarious position.


Yes, and that means that the new theory must reduce effectively to the old theory in the macroscopic regime where the old theory makes (almost) correct predictions.
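
A standard illustration of such a reduction is Ehrenfest's theorem, by
which quantum expectation values obey the Newtonian equations of motion:

\[
\frac{d}{dt}\langle x\rangle = \frac{\langle p\rangle}{m},
\qquad
\frac{d}{dt}\langle p\rangle = -\,\langle V'(x)\rangle \approx -\,V'(\langle x\rangle),
\]

where the last approximation holds for wave packets that are narrow on
the scale over which the force varies, i.e. in the macroscopic regime.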

That incorrectly insinuates that the old theory only makes accurate predictions in the macroscopic domain.  For example, I suspect that the solution to the problem of quantum gravity will imply natural cutoffs at the Planck scale, which we now often invoke heuristically.  Just because an ontology is microscopic doesn't make it immune to replacement.  String theory and loop quantum gravity are microscopic too.

If we then ask fundamental questions, e.g. about the existence of a multiverse, that can only be addressed by getting the details of the dynamics at the microscopic level correct, then it's not appropriate to fix up the theory by introducing notions from the macroscopic domain that should, in principle, follow from the fundamental dynamics at the micro-level.

The notions of "result" and "measurement" are not introduced; they are fundamental to knowledge.  They are exactly where MWI gets into trouble: by saying there is no result of an experiment, it muddles the concept of probability.
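
One way to put the point, as a minimal sketch: after a measurement
interaction the joint state is

\[
|\Psi\rangle = \alpha\,|0\rangle|O_0\rangle + \beta\,|1\rangle|O_1\rangle,
\]

where \(|O_0\rangle\) and \(|O_1\rangle\) are observer states recording the two
outcomes. If both branches exist and neither is singled out as the
result, it is no longer obvious what the Born weights \(|\alpha|^2\) and
\(|\beta|^2\) are probabilities of.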

Brent


The appearance of permanent records should follow from decoherence. But it makes sense to consider states of algorithms that process information as a more general notion of observation.
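
As a toy sketch of that more general notion (all names here are
hypothetical and purely illustrative, not a physical model): an
observer modeled as an algorithm whose internal state is updated by
the information it processes, so that "observation" is just a state
transition.

# Toy model: an "observer" as an algorithm whose internal state
# is updated by the bits it receives; observing *is* the update.
# Hypothetical names, for illustration only; not a physical model.

class AlgorithmicObserver:
    def __init__(self):
        self.memory = []        # record of everything observed so far

    def observe(self, bit):
        """Process one bit of input; the new memory state is the observation."""
        self.memory.append(bit)

    def state(self):
        """Identify the observer with its information-processing state."""
        return tuple(self.memory)

obs = AlgorithmicObserver()
for b in (0, 1, 1):
    obs.observe(b)
print(obs.state())              # (0, 1, 1)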

Saibal



