• concrete vs "coarse" - there's an ambiguity in the use of "coarse". Even if
the hologram loses information (through reduction and/or forgetting), what
remains is still *fine-grained*, concrete. It's only coarse if we allow some
of the grit to be more important than other grit, some compositions to be
ratchets or longer-surviving, some aspects over other aspects. A diamond
would last longer than a lump of coal. But both diamonds and lumps of coal
are fine-grained, by some lens, at least at their edges - diamonds cut glass,
after all. So the word/concept we want isn't "coarse". There should be a
better word/concept.
• "incompressible" - similar problem. Both "coarse" and "compression" imply a
kind of essentialism (or idealism?), a preference for a *model* over its referent (model-ism?).
• "structure" is in the right direction.
• But I doubt criticality as *the* locus for a reliable indexing structure. It
seems to pass the buck to [in]stability, however that's defined. Maybe I even
doubt *indexing* entirely? Am I rejecting the axiom of choice? IDK.
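A minimal sketch of what "incompressible" cashes out to operationally (Python,
illustrative only; an upper bound via zlib, not a definition): bytes with
algorithmic pattern compress to almost nothing, while entropy-sourced bytes
stay near their original size.

    import os
    import zlib

    def ratio(data: bytes) -> float:
        """Compressed size over original size; ~1.0 means incompressible."""
        return len(zlib.compress(data, 9)) / len(data)

    structured = b"abab" * 25_000     # a short program generates this
    random_ish = os.urandom(100_000)  # OS entropy: no known shorter description

    print(f"structured: {ratio(structured):.3f}")  # ~0.001
    print(f"random:     {ratio(random_ish):.3f}")  # ~1.0

Of course a compressor only bounds Kolmogorov complexity from above; no
program can certify incompressibility, which is kind of the point.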
<apophenia>
There seems to be a consensus that the COVID-induced (lockdown-induced?)
learning loss in math (and reading) for K-12 is worse than in other
subjects. Why? I think most
people believe math learning/understanding is staged, sequential, hierarchical,
*structural*, baffled [⛧], etc. whereas subjects like social studies are
thin/flat and can be convexly sampled.
I don't buy it. When I was younger, I read *a lot*. I read entire books and
journals where I felt like I understood maybe 50% of the words (ignoring stop
words of course). Sure, this prevented me from understanding the composition.
But you don't need to understand the orchestra in order to play guitar by
the fire.
To boot, my analysis prof regularly peppered us, primed us, with concepts
beyond our ken.
I think this conception of math as a baffled multi-space is a gatekeeping
construct, and that we continue teaching/assessing it that way is a better
hypothesis for why those poor kids have to pay for "remedial" math classes
when they get to college.
</apophenia>
Given that, criticality seems to be just another thing we can include or not
while binning things into {remember, forget}. IDK, though. Thanks for the post.
[⛧] Baffled meaning something like little sieves/gates/screens everywhere and
of different types such that a seeker can get stuck in any compartment
unless/until they change their shape/size/malleability so they can extrude
through the baffle.
On 12/11/25 10:08 AM, Jon Zingale wrote:
Contrasting the finite memory of local spacetime with the seemingly
discontinuous behavioral transitions a dynamical system undergoes as it moves
away from criticality suggests philosophical complications worth exploring. In
what follows, the working premise is that the universe unfolds along a
non‑Turing‑computable, Chaitin‑random process, and that any finite region of
spacetime has only finite informational fidelity, subject to physical limits on
storage and erasure.
While this does not denigrate the successes of objective scientific
pursuits—those requiring the preemptive registration of well‑defined objects
such as particles, waves, or distributions—it does invite equal attention to
the questions raised by Bekenstein, Kolmogorov, Chaitin, Per Martin‑Löf, and
others about the information‑theoretic limits of spacetime and the possibility
of incompressible informational drivers acting on a finite and dissipative
spacetime. On this view, no bounded region can serve as a perfect, indefinitely
faithful ledger of its own local history; at best, it carries a finite, coarse
record, continually rewritten by ongoing dynamics.
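For a concrete sense of that finiteness, a back-of-envelope version of the
Bekenstein limit (the constants are standard; the example numbers are
illustrative):

    import math

    HBAR = 1.054571817e-34  # J*s
    C    = 2.99792458e8     # m/s

    def bekenstein_bits(radius_m: float, mass_kg: float) -> float:
        """Upper bound on bits storable in a sphere of radius R, energy m*c^2."""
        energy = mass_kg * C**2
        return 2 * math.pi * radius_m * energy / (HBAR * C * math.log(2))

    # A 1 kg, 1 m sphere tops out around 2.6e43 bits: enormous, but finite.
    print(f"{bekenstein_bits(1.0, 1.0):.2e} bits")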
One idea to explore is that if an information‑dissipative universe is driven by a
non‑Turing‑computable, algorithmically random process, then not only could we find ourselves in a
universe that appears as ours does, but we would also find ourselves in a universe that is, at
every moment, as actively and incompressibly specified as at any other. A "universal"
model—say, near criticality—can characterize classes of possible behaviors and their scaling
structure, but it cannot, even in principle, decide which exact Chaitin‑random continuation is
being realized; identification of the singular sequence underlying "our" unfolding is
formally undecidable.
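To pin down the sense of "incompressibly specified", the standard
characterization of Martin‑Löf randomness via prefix complexity K
(Levin-Schnorr/Chaitin; a textbook statement, not specific to this post), in
LaTeX:

    x \in 2^{\omega} \text{ is Martin-L\"of random}
    \quad\iff\quad
    \exists c \,\forall n :\; K(x_{1:n}) \ge n - c

Since K is uncomputable, no effective procedure can certify a prefix as
incompressible, let alone identify the continuation being realized.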
Another consequence concerns the ontological status and recoverability of the past. If we
belong to an information‑productive and dissipative universe with only finite local
fidelity, then the past not only disappears from the present in the naive sense, but the
detailed evidential trace of past micro‑events is not guaranteed to persist within any
bounded region. Finite information capacity and the thermodynamic cost of erasure imply
that, over time, records are overwritten or degraded, so that only a coarse pattern of
correlations remains available for reconstruction, and even these correlations are
ultimately irrecoverable from the truncated "spacetime memory".
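A toy illustration of that overwriting (hypothetical names and parameters; a
bounded medium keeping only the last few coarse records of a micro-process):

    import random
    from collections import deque

    random.seed(0)
    CAPACITY = 8                        # finite "spacetime memory"
    memory = deque(maxlen=CAPACITY)     # old records silently overwritten

    state = 0.0
    for t in range(1000):
        state += random.gauss(0, 1)     # micro-event
        memory.append(round(state, 1))  # coarse, finite-fidelity record

    # Only the last CAPACITY coarse values survive; the first 992
    # micro-events left no trace and cannot be reconstructed from them.
    print(list(memory))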
I suppose this is a long-winded way of saying that the fidelity of a given
medium may or may not faithfully support the data one consumes and hopes to
build models/reconstructions from. For the sake of analogy, a couple of
images I have in mind are:
images I have in mind are:
1. Tracking price fluctuations in a market for a given security (with its few
decimal places) is unlikely to provide a sufficiently rich foundation for the
complete reconstruction of the underlying price mechanism (a toy version is
sketched after this list).
2. The noise floor of an instant photograph reveals more about the sensor, or
the grain and chemistry of the film, than about the target of the image
itself.
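A toy version of 1., just to make the many-to-one point explicit (the numbers
are hypothetical):

    import random
    from collections import Counter

    random.seed(1)
    TICK = 0.01
    # 10,000 distinct micro-prices, quantized to tick size:
    micro = [100.0 + random.gauss(0, 0.02) for _ in range(10_000)]
    ticks = [round(p / TICK) * TICK for p in micro]

    counts = Counter(ticks)
    print(f"{len(set(micro))} micro-prices -> {len(counts)} distinct ticks")
    # The tick map is massively many-to-one, so the recorded tape cannot be
    # inverted to recover the micro-prices, let alone the mechanism.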
These examples illustrate the broader concern: finite, noisy media do not, in
general, encode a complete, invertible history of the processes that shaped
them. In a universe driven by an incompressible process and constrained by
information‑theoretic limits, this becomes a structural feature rather than a
mere practical nuisance.
It seems reasonable, then, to ask how physical laws are encoded in spacetime,
and whether they are just as much a side effect of renormalization and
coarse‑graining as color is. Criticality can be viewed as the locus of
genuinely universal structure, where many microscopically distinct systems
share a single fixed‑point description, while symmetry breaking “chooses a
singular” outcome, re‑grounding the system in a particular contingent
history that the
universal theory does not uniquely fix. In such a setting, effective laws and
apparent conservation principles look less like timeless global book‑keeping
rules and more like emergent regularities that hold within specific regimes,
supported by finite‑fidelity records rather than a perfect global archive.
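Perhaps the simplest honest toy for that universality claim is the central
limit theorem read as a renormalization fixed point: block-averaging (a crude
coarse-graining step) sends microscopically distinct noise ensembles to the
same Gaussian, fixing the scaling structure but not the realized sample path.
A sketch:

    import random
    import statistics

    random.seed(2)

    def block_rg(xs, block=4):
        """One coarse-graining step: block-sum, rescaled to preserve variance."""
        return [sum(xs[i:i + block]) / block**0.5
                for i in range(0, len(xs) - block + 1, block)]

    # Two microscopically distinct, variance-1 ensembles:
    uniform = [random.uniform(-3**0.5, 3**0.5) for _ in range(4**9)]
    coins   = [random.choice((-1.0, 1.0)) for _ in range(4**9)]

    for _ in range(4):  # iterate the coarse-graining map
        uniform, coins = block_rg(uniform), block_rg(coins)

    def kurtosis(xs):
        m, v = statistics.fmean(xs), statistics.pvariance(xs)
        return statistics.fmean([(x - m)**4 for x in xs]) / v**2

    # Both flow toward the Gaussian fixed point (kurtosis -> 3):
    print(f"uniform: {kurtosis(uniform):.2f}")
    print(f"coins:   {kurtosis(coins):.2f}")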
Practically, it seems scientifically productive to develop methodologies for measuring,
or at least bounding, the dissipation and erasure rates implied by these physical limits.
Doing so would help clarify how far reconstructions of the past, confidence in
"laws", and appeals to universality can be pushed in a universe whose unfolding
may be fundamentally non‑computable and whose memory is ultimately finite.
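One such bound already exists in one direction: Landauer's principle puts a
floor of k_B * T * ln(2) joules on erasing a bit, so a power budget caps the
erasure rate. The arithmetic (standard constants; the 1 W device is
hypothetical):

    import math

    K_B = 1.380649e-23  # J/K

    def landauer_joules_per_bit(temp_k: float) -> float:
        return K_B * temp_k * math.log(2)

    def max_erasure_rate(power_w: float, temp_k: float) -> float:
        """Upper bound on bits erased per second under a power budget."""
        return power_w / landauer_joules_per_bit(temp_k)

    # At 300 K, erasing one bit costs at least ~2.9e-21 J, so a 1 W device
    # can erase at most ~3.5e20 bits per second.
    print(f"{landauer_joules_per_bit(300):.2e} J/bit")
    print(f"{max_erasure_rate(1.0, 300):.2e} bits/s")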