On Monday, March 11, 2019 at 7:43:54 PM UTC-6, John Clark wrote:
>
>
> On Mon, Mar 11, 2019 at 8:42 PM Lawrence Crowell <[email protected]> 
> wrote:
>
>> all the radiation emitted is entangled with the black hole, which would 
>> then mean the entanglement entropy increases beyond the Bekenstein bound. 
>
>
>
> Could nature be trying to tell us that the Bekenstein bound is simply 
> wrong and spacetime is continuous and can store information at scales 
> even smaller than the Planck area? After all, as far as I know there is no 
> experimental evidence the Bekenstein bound exists or that spacetime ends 
> when things get smaller than 10^-35 meters.
>
> John K Clark
>

Warning: this is a bit long, but I hope it is informative and interesting. 
John's question pertains to the Planck scale and the Bekenstein bound. The 
issue of quantum information and the firewall really arises on considerably 
larger scales. I do address some conundrums with the Planck scale towards 
the end.

As with the analogue of a thermal cavity, the entanglement of the emitted 
radiation shifts from radiation entangled with the cavity, or with the hot 
photon-emitting atoms, to entanglement between photons. Photons previously 
emitted and entangled with atoms then become entangled with subsequent 
photons emitted by those atoms. It is interesting how entanglement is 
really all around us, though mostly uncontrolled, as an aspect of 
thermodynamics. Anyway, this crossover happens at a time called the Page 
time, after Don Page, who first identified it. For the cavity it occurs 
when around half the photons have been emitted, and the same happens with 
black holes when about half the initial mass has been emitted as Hawking 
radiation. The time it takes a black hole (BH) to decay completely by 
quantum emission is proportional to the cube of its mass, which means the 
black hole has emitted half its mass in 7/8ths of its expected lifetime.
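As a sanity check on the cubic scaling, here is a minimal Python sketch, 
assuming the standard photon-only Hawking lifetime t = 5120πG²M³/(ħc⁴); the 
numerical prefactor depends on the emitted species and greybody factors, so 
read the output as order-of-magnitude only.

    import math

    G     = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
    hbar  = 1.0546e-34   # reduced Planck constant, J s
    c     = 2.998e8      # speed of light, m/s
    M_sun = 1.989e30     # solar mass, kg
    yr    = 3.156e7      # seconds per year

    # Photon-only Hawking lifetime, scaling as M^3:
    t_evap = 5120 * math.pi * G**2 * M_sun**3 / (hbar * c**4)

    # Since t ~ M^3, half the mass is gone after 1 - (1/2)^3 = 7/8 of t:
    t_half = (7.0 / 8.0) * t_evap

    print(f"full evaporation:  {t_evap / yr:.1e} yr")   # ~2e67 yr
    print(f"half mass emitted: {t_half / yr:.1e} yr")

For a solar mass this gives roughly 2×10^{67} years, consistent with the 
figure quoted below.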

This means that when a black hole is reduced to half of its original mass, 
the photons emitted long ago in bipartite entanglement with the BH 
(emission that, for a solar mass black hole, stretches over some 10^{67} 
years) are now entangled not only with the BH but also with newly emitted 
photons. This is a big problem. It tells us there is a difficulty in making 
the entanglement entropy fit the Bekenstein bound, and that bipartite 
entanglements are transformed into tripartite entanglements; since 
entanglement is monogamous, this means quantum unitarity fails. That is not 
something people are willing to abandon so easily, so what AMPS [A. 
Almheiri, D. Marolf, J. Polchinski, J. Sully, "Black holes: complementarity 
or firewalls?", JHEP 02 (2013) 062, arXiv:1207.3123] proposed was that 
instead of losing quantum unitarity, maybe the equivalence principle of 
general relativity fails. The BH then develops a sort of naked singularity 
at the horizon, called the firewall, where anything that enters is just 
demolished or "burned up," as it would be in the interior of a BH.

If quantum mechanics builds up spacetime as entanglements, or equivalently 
if spacetime is an emergent phenomenon of quantum mechanics (QM), then the 
unitarity of QM and the equivalence principle (EP) of general relativity 
(GR) may either be equivalent in some way or share a duality. If we think 
about it, the Einstein field equation 

R_{μν} − ½ R g_{μν} = (8πG/c⁴) T_{μν}

tells us that weak gravitation on the left side of the equals sign is equal 
to strongly interacting stuff on the right. In a quantum mechanical setting 
the left-hand side is quantum mechanical at extreme energy, in the UV, 
while the right-hand side is all around us at low or moderate energy, in 
the IR. There is then a duality between quantum gravitation at extreme 
energy and quantum field theory at lower energy. 

The holographic principle of black holes indicates that any system 
approaching a black hole becomes less localized as seen by an asymptotic 
observer. The optical lensing of spacetime spreads any wave function, or 
for that matter any local field amplitude, across the near-horizon region. 
Quantum field theory may then no longer be applicable, for its Wightman 
conditions were imposed in part to separate nonlocal quantum physics, which 
at high energy lives on very small scales, from the physics one observes 
with detectors on larger scales. 

The best thing to come out of superstring theory is Maldacena's 
correspondence between anti-de Sitter spacetime in N dimensions and the 
conformal field theory on its boundary in N − 1 dimensions. This gives me a 
sense that superstring theory may have far less to do with TeV-scale 
physics and a lot more to do with quantum cosmology. In effect it connects 
the global physics of cosmology in the bulk of an AdS spacetime with the 
local conformal field theory on the boundary, in one dimension less. This 
is a quantum spacetime version of the Gauss-Bonnet theorem! If one expands 
the AdS action S = ∫ d⁴x √(−g) R with R_{abcd}R^{abcd} instanton terms and 
their duals, you get the Euler and Hirzebruch characteristics. Then in the 
AdS/CFT correspondence the difference between the topological numbers from 
quantum gravity in the nonlocal AdS bulk and the local topological numbers 
on the boundary is zero. Fantastic, if you think about it!
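For reference, and up to convention-dependent signs and normalizations, the 
two topological integrals in question are

χ = (1/32π²) ∫ d⁴x √g (R_{abcd}R^{abcd} − 4 R_{ab}R^{ab} + R²)

τ = (1/48π²) ∫ d⁴x √g R_{abcd} *R^{abcd},

the Gauss-Bonnet form of the Euler characteristic and the Pontryagin-type 
form of the Hirzebruch signature, where *R^{abcd} denotes the dual 
curvature.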

The connection between locality and nonlocality defines both the dS and AdS 
spacetimes. The AdS spacetime is one sheet of a hyperboloid of two sheets, 
while the dS spacetime is a hyperboloid of one sheet. 

http://www.network-graphics.com/images/math/hyper_parts_m.jpg
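Concretely, with one standard choice of embedding in flat 5-space with 
metric ds² = −dX₀² + dX₁² + dX₂² + dX₃² + dX₄², the two quadrics are

−X₀² + X₁² + X₂² + X₃² + X₄² = +L²   (one sheet: dS_4)

−X₀² + X₁² + X₂² + X₃² + X₄² = −L²   (two sheets: each is the hyperbolic 
space H⁴, the Euclidean continuation of AdS_4; Lorentzian AdS_4 itself sits 
in signature (2,3) as −X₀² − X₄² + X₁² + X₂² + X₃² = −L²).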

In the momentum-energy representation these meet at I^± in momentum-energy 
spacetime at the Planck scale. So the dS spacetime is a sort of patching of 
two AdS's with the transition to positive Λ, which in turn has two causal 
regions. Hence a holographic screen with a positive junction in AdS_n will 
contain a dS_{n−1}. Since these all connect to the physics of the boundary 
CFT, I think this may constrain the physics. This provides me with the 
motivation, at least, to think that spacetime and quantum information are 
much the same. The loss of the EP is just a sort of transformation of 
spacetime information (largely in the form of curvature, etc.) into quantum 
bits, and the converse can occur. This then motivates a further development 
of how this may happen with Hawking radiation that avoids these problems to 
a larger degree. There is, though, a lingering issue of conformal 
invariance and the breaking of conformal symmetry. 

I could go on a lot more about how the firewall relates to extremal BHs and 
BPS BHs. It also relates to quantum error correction codes and the Hamming 
distance. If you have a library where books are not reshelved regularly, 
then once about half the books are stacked irregularly off their duly 
appointed shelves, it becomes much harder to reshelve them. This is a limit 
on error correction, and the Page time or firewall is related to it.
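As a toy classical illustration of the Hamming distance point, here is a 
minimal Python sketch with a hypothetical 5-bit repetition code (not 
anything from the BH literature): decoding to the nearest codeword, the 
"reshelving" step, works only while the number of errors stays below half 
the minimum Hamming distance.

    # Hypothetical 5-bit repetition code; minimum distance d = 5,
    # so up to floor((d-1)/2) = 2 flipped bits can be corrected.
    codewords = ["00000", "11111"]

    def hamming(a, b):
        """Number of positions at which two equal-length bit strings differ."""
        return sum(x != y for x, y in zip(a, b))

    def reshelve(word):
        """Decode to the nearest codeword -- the 'reshelving' step."""
        return min(codewords, key=lambda cw: hamming(word, cw))

    print(reshelve("00101"))  # 2 flips: correctly restored to "00000"
    print(reshelve("01101"))  # 3 flips: mis-'reshelved' to "11111"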

Now to the issue of the Planck scale. The Planck scale really tells us 
there is a cut-off scale for locating a quantum bit. This is the scale 
where a black hole radius r = 2GM/c² equals the Compton wavelength of the 
black hole, λ = ħ/Mc. Just equate 2λ = r (the 2 is from a Nyquist 
requirement) to get M = √(ħc/G). Then use the Heisenberg uncertainty 
ΔEΔt ≈ ħ to get the Planck time, and from that the Planck length 
ℓ_p = √(Għ/c³), the tiny distance ℓ_p = 1.6×10^{-33} cm. This is odd in a 
way, for spacetime physics, particularly if we are to think of matter and 
fields as derived from spacetime, should be conformally invariant. That 
means there is no scale at which the physics changes, yet masses and their 
Compton wavelengths do set scales where the physics is very different. We 
have a contradiction of sorts! Strangely, I have not seen anyone make a 
fuss over this. So something is indeed odd here.
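Putting numbers to the estimate above, a short Python sketch of the same 
arithmetic:

    import math

    G    = 6.674e-11    # m^3 kg^-1 s^-2
    hbar = 1.0546e-34   # J s
    c    = 2.998e8      # m/s

    M_p = math.sqrt(hbar * c / G)      # Planck mass, from 2*lambda = r
    l_p = math.sqrt(G * hbar / c**3)   # Planck length
    t_p = l_p / c                      # Planck time

    print(f"M_p = {M_p:.2e} kg")       # ~2.2e-8 kg
    print(f"l_p = {l_p * 100:.2e} cm") # ~1.6e-33 cm
    print(f"t_p = {t_p:.2e} s")        # ~5.4e-44 s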

From an experimental perspective we know that the arrival of γ-rays and 
optical photons from bursters billions of light years distant is 
simultaneous. If spacetime were sliced and diced into Planck units we would 
expect a frequency-dependent dispersion of the photons. That is falsified 
by these observations. The measurement is not direct; it does not invoke 
high energy collisions at the Planck scale, but rather examines spacetime 
on a far IR scale, where small fluctuations should still have some effect. 
So it does not definitively rule out the nonlocalizability of a qubit on 
scales smaller than a Planck unit of distance, area or volume. 
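To make the time-of-flight argument quantitative: in the simplest 
linear-dispersion ansatz the arrival-time spread between photons differing 
in energy by ΔE from a source at distance D is Δt ≈ (ΔE/E_Planck)(D/c), 
ignoring redshift factors. A sketch with purely illustrative numbers:

    import math

    G    = 6.674e-11
    hbar = 1.0546e-34
    c    = 2.998e8

    E_planck = math.sqrt(hbar * c**5 / G)   # ~2e9 J, i.e. ~1.2e19 GeV

    # Illustrative numbers, not a specific observed burst:
    ly = 9.461e15                 # metres per light year
    D  = 5e9 * ly                 # source ~5 billion light years away
    dE = 10e9 * 1.602e-19         # 10 GeV gamma ray vs. optical, in joules

    dt = (dE / E_planck) * (D / c)
    print(f"dt ~ {dt:.2f} s")     # ~0.1 s

A spread of order 0.1 s would stand out against millisecond burst 
substructure, which is why these observations are so constraining.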

There is then a serious issue here, and it is related to the firewall 
problem. The Bekenstein bound really should apply, for if it breaks down we 
then have a huge host of horrors in physics. It would mean timelike loops 
are possible, causality is lost, and so forth. I think causality has a 
dualism with entanglement symmetries, but that is for later. 

LC
