On 14 Feb 2012, at 04:00, Stephen P. King wrote:
On 2/13/2012 5:54 PM, acw wrote:
On 2/12/2012 17:29, Stephen P. King wrote:
I would like to bring the following to your attention. I think we
do need to revisit this problem.
The Anthropic Trilemma
I gave a tentative (and likely wrong) possible solution to it in
another thread. The trilemma is much lessened if one considers a
relative measure on histories (chains of OMs) and their length.
That is, if a branch has more OMs, it should be more likely.
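The OM-weighted relative measure described above can be sketched in a few lines of Python (a toy illustration only, not anything from the post: the branch names and OM counts are invented):

```python
# Toy relative measure on histories: a branch's weight is taken to be
# proportional to its number of observer-moments (OMs).
# Branch names and OM counts below are invented for illustration.
branches = {"short": 3, "long": 300}  # branch -> number of OMs

total = sum(branches.values())
measure = {b: n / total for b, n in branches.items()}

# The longer branch dominates: its measure is 300/303.
print(measure["long"])
```

Under this (assumed) weighting, a continuation in the branch with many more OMs is correspondingly more likely.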
The first horn doesn't apply because you'd have to keep the copies
running indefinitely (merging won't work).
The second horn, I'm not so sure if it's avoided: COMP-immortality
implies potentially infinite histories (although mergers may make
them finite), which makes formalizing my idea not trivial.
The third horn only applies to ASSA, not RSSA (implicit in COMP).
The fourth horn is acceptable to me: we can't really deny Boltzmann
brains, but they shouldn't be that important, as the experience
isn't spatially located anyway (MGA). The white rabbit problem is
more of a worry in COMP than this horn.
The fifth horn is interesting, but also the most difficult to
solve: it would require deriving local physics from COMP.
My solution doesn't really solve the first horn, though; it just
makes it more difficult: if you do happen to make 3^^^3 copies of
yourself in the future and they live very different and long lives,
that might make it more likely that you end up with a continuation
in such a future. However, making copies and merging them shortly
afterwards won't work.
This solution will only work for finite sets and very special
versions of infinite sets. For infinities like that of the
integers, it will not work, because any proper subset of the
infinite set is identical to the complete set, as we can
demonstrate with a one-to-one map between the odd integers and the
integers.
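The one-to-one map in question can be written out explicitly (a minimal illustration; the function names are mine):

```python
# The map n -> 2n + 1 sends the natural numbers onto the odd
# naturals, so the two sets have the same cardinality even though
# the odds are a proper subset of the naturals.
def to_odd(n):
    return 2 * n + 1

def from_odd(m):
    assert m % 2 == 1
    return (m - 1) // 2

# Round trip on the first few naturals:
print([to_odd(n) for n in range(5)])            # [1, 3, 5, 7, 9]
print([from_odd(to_odd(n)) for n in range(5)])  # [0, 1, 2, 3, 4]
```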
You should not confuse bijection (set isomorphism) with equality.
Also, measure exists on infinite discrete sets, by weakening the
sigma-additivity constraints. And then, finally, the measure
problem bears on the infinite extensions of computations, and there
are 2^aleph_0 of those.
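As a concrete instance of a measure on an infinite discrete set with weakened sigma-additivity, natural (asymptotic) density is finitely additive but not countably additive. A rough numerical sketch (the helper `density` is mine, and it only approximates the limit by truncation):

```python
# Natural (asymptotic) density: a finitely additive "measure" on
# subsets of N, obtained by giving up full sigma-additivity.
# density(A) = lim_{N -> inf} |A ∩ {0..N-1}| / N, when the limit
# exists; here we approximate the limit with a finite cutoff.
def density(pred, N=100000):
    return sum(1 for n in range(N) if pred(n)) / N

print(density(lambda n: n % 2 == 0))  # evens have density ~ 0.5
print(density(lambda n: n % 3 == 0))  # multiples of 3: ~ 1/3
```

Note that each singleton {n} has density 0 while the whole of N has density 1, which is exactly where countable additivity fails.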
Remember the one-line UD program:
For all i, j, k, compute the k first steps of phi_i(j).
We can describe a computation as a sequence phi_i(j)^0,
phi_i(j)^1, ..., phi_i(j)^k.
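The one-line UD above can be sketched as a toy dovetailer (a hypothetical stand-in: the real phi_i ranges over all partial computable functions, and the true UD dovetails over an unbounded domain rather than a finite cutoff `limit`):

```python
# Toy dovetailer: "for all i, j, k, compute the k first steps of
# phi_i(j)". The family phi below is an invented placeholder for
# the enumeration of all partial computable functions.
def phi(i, j):
    """Stand-in for the i-th partial function applied to j:
    an (invented) infinite stream of computation steps."""
    state = j
    while True:
        yield state
        state = state + i + 1  # dummy 'step'

def dovetail(limit):
    """Run the k first steps of phi_i(j) for all i, j, k < limit."""
    trace = []
    for i in range(limit):
        for j in range(limit):
            for k in range(limit):
                gen = phi(i, j)
                steps = [next(gen) for _ in range(k)]
                trace.append(((i, j, k), steps))
    return trace

trace = dovetail(3)
print(len(trace))  # 27 triples (i, j, k), each with its k steps
```

Each entry of `trace` is one finite prefix phi_i(j)^0, ..., phi_i(j)^(k-1); the set of such finite prefixes is enumerable, as the post says.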
That set is enumerable, but the set of all sequences going through
equivalent 1p-steps is not enumerable, and you can define a measure
by just using the normal distribution in a manner similar to the
dovetailing on the reals. This has just to be corrected to take
into account the constraints of self-reference, which seem to be
the origin of an arithmetical quantization, negative amplitudes of
probability, etc.
Given that the number of computations that a universal TM can run
is at least the countable infinity of the integers, we cannot use a
comparison procedure to define the measure.
You confuse the computations made by the UD, and observed by an
outsider, with the infinite computations going through your actual
1p-state. Those include all the dummy computations dovetailing on
the reals, and cannot be enumerated.
Think about the iterated self-duplication. It leads to the usual
2^n histories after n duplications, and to a continuum (2^aleph_0)
of infinite histories in the limit.
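The iterated self-duplication can be tabulated directly: n duplications yield 2^n finite histories, each a binary string, and the infinite histories form a continuum. A small sketch (the `W`/`M` labels follow the usual Washington/Moscow thought experiment):

```python
from itertools import product

# After n iterated self-duplications (W/M splits), there are 2**n
# finite histories; the infinite histories are the infinite binary
# sequences, of cardinality 2^aleph_0, hence not enumerable.
def histories(n):
    return ["".join(h) for h in product("WM", repeat=n)]

print(len(histories(4)))  # 16 = 2**4
```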
(Maybe this is one of the reasons many very smart people have tried,
unsuccessfully, to ban infinite sets...)
Not at all. Infinite sets have been introduced to make the measure
problem easier, even for problems handling finite objects when they
are very numerous.
Mathematical logic explains that the finite and the enumerable are
more complex than the continuum, whose existence is basically
motivated by the search for simplification. For example, Fermat's
last theorem on the reals is trivial. Not so on the non-negative
integers.
You received this message because you are subscribed to the Google Groups
"Everything List" group.