---------- Forwarded message ---------
From: Astral Codex Ten <[email protected]>
Date: Tue, Sep 10, 2024 at 1:07 AM
Subject: Contra DeBoer On Temporal Copernicanism
To: <[email protected]>


Contra DeBoer On Temporal Copernicanism

Freddie deBoer has a post on what he calls “the temporal Copernican principle.” He argues we shouldn’t expect a singularity, apocalypse, or any other out-of-distribution thing in our lifetimes. Discussing celebrity transhumanist Yuval Harari, he writes:

What I want to say to people like Yuval Harari is this. The modern human
species is about 250,000 years old, give or take 50,000 years depending on
who you ask. Let’s hope that it keeps going for awhile - we’ll be
conservative and say 50,000 more years of human life. So let’s just throw
out 300,000 years as the span of human existence, even though it could
easily be 500,000 or a million or more. Harari's lifespan, if he's lucky,
will probably top out at about 100 years. So: what are the odds that
Harari’s lifespan overlaps with the most important period in human history,
as he believes, given those numbers? That it overlaps with a particularly
important period of human history at all? Even if we take the conservative
estimate for the length of human existence of 300,000 years, that means
Harari’s likely lifespan is only about .33% of the entirety of human
existence. Isn’t assuming that this .33% is somehow particularly special a
very bad assumption, just from the basis of probability? And shouldn’t we
be even more skeptical given that our basic psychology gives us every
reason to overestimate the importance of our own time?

(I think there might be a math error here - 100 years out of 300,000 is
0.033%, not 0.33% - but this isn’t my main objection.)
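
The corrected arithmetic, using the numbers from the quoted passage, is a one-liner:

```python
# A 100-year lifespan as a fraction of deBoer's 300,000-year span
# of human existence.
lifespan_years = 100
history_years = 300_000

share = lifespan_years / history_years
print(f"{share:.3%}")  # 0.033%, not the 0.33% in the quoted passage
```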

He then condemns a wide range of people, including me, for failing to
understand this:

Some people who routinely violate the Temporal Copernican Principle include
Harari, Eliezer Yudkowsky, Sam Altman, Francis Fukuyama, Elon Musk, Clay
Shirky, Tyler Cowen, Matt Yglesias, Tom Friedman, *Scott Alexander*, every
tech company CEO, Ray Kurzweil, Robin Hanson, and many many more. I think
they should ask themselves how much of their understanding of the future
ultimately stems from a deep-seated need to believe that their times are
important because they think they themselves are important, or want to be.

I deny misunderstanding this. Freddie is wrong.

Since we don’t know when a future apocalypse might happen, we can
sanity-check ourselves by looking at past apocalyptic near-misses. The
closest that humanity has come to annihilation in the past 300,000 years
was probably the Petrov nuclear incident
<https://substack.com/redirect/74424a0a-0644-4215-bfbe-60e20977766d?j=eyJ1IjoiNngzbm4ifQ.I1PMvYo4mI3PquTDRhL5Dev-9_ouIq3kw6ZhrVNsy8o>
in 1983¹, ie within Freddie’s lifetime. Pretty weird that out of 300,000
years, this would be only 41 years ago!

Maybe you’re more worried about environmental devastation than nuclear war?
The biggest climate shock of the past 300,000 years is . . . also during
Freddie’s lifetime². Man, these three-in-a-thousand coincidences keep
adding up!

“Temporal Copernicanism”, as described, fails basic sanity checks. But we
shouldn’t have even needed sanity checks as specific as these: common sense
already tells us that new apocalyptic weapons and environmental disasters
were more likely to arise during the 20th century than, say, the century
between 184,500 BC and 184,400 BC.

What’s Freddie doing wrong, and how can we do better? The following argument is loosely based on one by Toby Ord. Consider three types of events:

First, *those uniformly distributed across calendar time*. For example, asteroid strikes are like this. Here Freddie is completely right: if there are 300,000 years of human history, and you live 100 years, there’s a 0.03% chance that the biggest asteroid strike in human history happens during your lifetime. Because of this, most people who think about existential risk don’t take asteroid strikes too seriously as a potential cause of a near-term apocalypse.

Second, *those uniformly distributed across humans*. This is what you might use to solve Sam Bankman-Fried’s Shakespeare problem - what’s the chance that the greatest playwright in human history is alive during a given period? Freddie sort of gets this far³, and provides a number: 7% of humans who ever lived are alive today⁴.
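
As a rough cross-check on that 7% figure (the population totals here are not from the post; they are the commonly cited Population Reference Bureau estimates):

```python
# Commonly cited estimates: roughly 117 billion humans ever born,
# roughly 8 billion alive today.
humans_ever = 117e9
alive_now = 8e9

print(f"{alive_now / humans_ever:.1%}")  # ~6.8%, close to the post's 7%
```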

Third, *those uniformly distributed across techno-economic advances*. You’d
use this to answer questions like “how likely is it that the most important
technological advance in history thus far happens during my lifetime?” This
seems like the right way to predict things like nuclear weapons, global
warming, or the singularity. But it’s harder to measure than the previous
two.

You could try using GDP growth. At the beginning of Freddie’s life, world
GDP (measured in real dollars) was about $40 trillion per year. Now it’s
about $120 trillion. So on this metric, about 66% of absolute
techno-economic progress has happened during Freddie’s lifetime. But we
might be more interested in relative techno-economic progress. That is, the
Agricultural Revolution might have increased yields from 10 bushels to 100
bushels of corn. And some new tractor design invented yesterday might
increase it from 10,000 bushels to 10,100 bushels. But that doesn’t mean
the new tractor design was more important than the Agricultural Revolution.
Here I think the right measure is log GDP growth; by this metric, about 20%
of techno-economic progress has happened during Freddie’s lifetime.
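
Both metrics can be sketched in a few lines. The $40 trillion and $120 trillion endpoints are from the post; the all-history baseline GDP is my own hypothetical assumption, and the log metric is very sensitive to it:

```python
import math

# World GDP endpoints from the post (real dollars per year).
gdp_start = 40e12    # at the beginning of Freddie's life
gdp_now = 120e12     # today

# Absolute metric: fraction of all GDP growth within Freddie's lifetime,
# treating pre-modern GDP as negligible next to $40T.
absolute_share = (gdp_now - gdp_start) / gdp_now
print(f"absolute: {absolute_share:.0%}")   # ~67%, the post's "about 66%"

# Relative metric: fraction of all-history *log* GDP growth.
# gdp_ancient is a hypothetical baseline for earliest human GDP;
# the post doesn't give one, and the result swings with this choice.
gdp_ancient = 1e9
log_share = math.log(gdp_now / gdp_start) / math.log(gdp_now / gdp_ancient)
print(f"log: {log_share:.0%}")   # ~9% under this baseline
```

For the log share to come out at the ~20% quoted above, the lifetime growth factor of 3 would have to be one-fifth of total log growth, i.e. an all-history growth factor of about 3⁵ ≈ 240x. Either way, the qualitative point survives: a nontrivial chunk of all relative progress falls within a single lifespan.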

Freddie sort of starts thinking in this direction⁵, but shuts it down on the grounds that some people think technological growth rates have slowed down since the mid-20th century. The metric usually brought out to support this is total factor productivity, whose changes do show the mid-20th century as a more dynamic period than today. So fine, let’s do the same calculation with TFP. My impression from eyeballing this paper is that about 35% of all TFP growth and 15% of all log TFP growth has *still* happened during Freddie’s lifetime.

So what’s our prior that the most exciting single technological advance in
history thus far will happen during Freddie’s lifetime? I think a
conservative number would be 15%⁶.

How do we move from “most exciting advance in history” to questions about
the singularity or the apocalypse?

Robin Hanson cashes out “the singularity” as an economic phase change of equal magnitude to the Agricultural or Industrial Revolutions. If we stick to that definition, we can do a little better at predicting it: it’s a change of a size that has happened twice before, so doubling the previous number gives a ~30% chance that such a change happens in our lifetime.

(Sanity check: the last such earth-shattering change was the Industrial Revolution, about 3-4 lifetimes ago.)

What about the apocalypse? This one is tougher. Freddie tries an argument from absurdity: suppose the apocalypse happened tomorrow. Wouldn’t it be crazy that you, of all the humans who have ever existed, were correct when you thought the apocalypse was nigh? No, it’s not crazy at all. If the apocalypse happens tomorrow, then 7% of humans throughout history would have been right to predict an apocalypse in their lifetime. That’s not such a low percentage - your probability of being born in the final generation is about the same as (eg) your probability of being born in North America.

Here’s a question I don’t know how to answer - the number above (7%) is about how surprised you should be if the apocalypse happens in your lifetime. But I don’t think it’s the *overall chance* that the apocalypse happens in your lifetime, because the apocalypse could be millions of years away, after trillions more humans have lived, and then retroactively it would seem much less likely that the apocalypse happened during the 21st century. So: is it possible to calculate this chance? I think there ought to be a way to leverage the Carter Doomsday Argument here, but I’m not quite sure of the details.

Speaking of the Carter Doomsday Argument…

…Freddie is re-inventing anthropic reasoning, a well-known philosophical concept. The reason the hundreds of academics who have written books and papers about anthropics have never noticed that it disproves transhumanism and the singularity is that Freddie’s version has obvious mistakes that a sophomore philosophy student would know better than to make.

(local Substacker Bentham’s Bulldog is a sophomore philosophy student, and his anthropics mistakes are much more interesting.)

The world’s leading expert on anthropic reasoning is probably Oxford philosophy professor Nick Bostrom, who literally wrote the book on the subject. Awkwardly for Freddie, Bostrom is also one of the founders of the modern singularity movement. This is because, understood correctly, anthropics provides no argument against a singularity or any other transhumanist idea, and might (weakly) support them.

I think if you use anthropic reasoning correctly, you end up with a prior
probability of something like 30% that the singularity (defined as a
technological revolution as momentous as agriculture or industry) happens⁷
during your lifetime, and a smaller percent that I’m not sure about (maybe
7%?) that the apocalypse happens during your lifetime. None of these
probabilities are lower than the probability that you’re born in North
America, so people should stop acting like they’re so small as to be absurd
or impossible.

But also, prior probabilities are easy-come, easy-go. The prior probability that you’re born in Los Angeles is only 0.05%. But if you look out your maternity ward window and see the Hollywood sign, ditch that number immediately and update to near certainty. No part of anthropics should be able to prevent you from updating on your observations about the world around you, and on your common sense.

(except maybe the part about how you’re in a simulation, or the part about how there’s definitely a God who created an infinite number of universes, or how there must be thousands of US states, or how the world must end before 10,000 AD, or how the Biblical Adam could use his reproductive decisions as a shortcut to supercomputation, or several other things along these same lines. I actually hate anthropic reasoning. I just think that if you’re going to do it, you should do it right.)

¹ The Toba supervolcano is overrated. You could argue the Cuban Missile Crisis was worse than Petrov, but that just brings us back 60 years instead of 40, which I think still proves my point.

² Something called “the Eemian” 130,000 years ago was larger in magnitude, but happened gradually over several thousand years.

³ If he got this far halfway down, why did he even present the obviously-wrong 0.03% number as his headline result? Was he hoping we wouldn’t read the rest of his post?

⁴ This is slightly wrong for the exact framing of the question; your life is a span rather than a point, so probably by the time you die, about 10% of humans will have been alive during your lifespan. The exact way you think about this depends on how old you are, and I’ll stick with the 7% number for the rest of the essay.

⁵ Again, I don’t understand why he bothered giving the earlier obviously-wrong-for-this-problem numbers, vaguely half-alluded to the existence of this one only to complain that someone could miscalculate it, and then put no effort into calculating it correctly, or at least into admitting that he couldn’t calculate the number that mattered.

⁶ Some of these numbers depend on whether you’re thinking of “lifespan” vs. “lifespan so far,” and how much of your actually-existing foreknowledge about the part of your life you’ve already lived you’re using. I’m just going to handwave all of that away, since it depends on how you’re framing the question and doesn’t change the results by more than a factor of two or three.

⁷ Realistically the Agricultural and Industrial Revolutions were long processes rather than point events. I think the singularity will be shorter (just as the Industrial Revolution was shorter than the Agricultural), but if this bothers you, imagine we’re talking about the start (or peak) of each.




© 2024 Scott Alexander