On 04 Jun 2016, at 03:17, Russell Standish wrote:
On Tue, May 31, 2016 at 05:04:48PM +0200, Bruno Marchal wrote:
On 31 May 2016, at 02:33, Russell Standish wrote:
Hmm - the "output" of the UD (ie UD*) is a very low complexity
object. The complexity you refer to is actually UD* seen from the
inside by a computationlist observer. That complexity has indeed
arisen through an evolutionary process: mutation via the FPI,
?
FPI is a random process due to only being able to observe a single
branch within a branching process. Mutation is exactly the same,
particularly as seen from the inside of a multiverse. This argument is
made in my paper "Evolution in the Multiverse".
The FPI is more a personal event (or sequence of personal events) than
a process. The process to which we can attach the random experiences
is itself non-random (just a 3p duplication).
But a mutation is a process. Now, if you assume it is a purely quantum
process, then you can use the classical FPI to explain the perception
of evolution (in the multiverse). OK.
The counting algorithm produces a simple object. Complexity is
generated by selecting some subset of that simple object, and it is
the selection which creates the complexity.
Like in quantum mechanics. But the complexity must be present before
we can observe it, and it is mathematically present, like all the
branches of the Everett universal wave.
Why? The opposite clearly appears to be the case. Mathematically, the
UD*, or the Multiverse, are simple objects. Hence their appeal under
Occam's razor.
Simple as a whole (like the everything idea), but it generates all the
complexities from inside. That generation is out-of-time, and the
complexity is there out of time too.
To restate the above, you are confusing the complexity observed by a
putative internal observer (which by the computationalism assumption
must exist) with the complexity of the UD*. The former is generated
by an evolutionary process, and is high; the latter is low (being
equal to the KCS complexity of the UD).
I just use Blum complexity. It exists without the introduction of
I had a look at Blum complexity from
https://en.wikipedia.org/wiki/Blum_axioms
----------------------------------------------
A Blum complexity measure is a pair (φ, Φ), with φ a Gödel numbering
of the partial computable functions P^(1), and Φ a computable
function Φ : N → P^(1), which satisfies the following Blum axioms.
We write φ_i for the i-th partial computable function under the
Gödel numbering φ, and Φ_i for the partial computable function Φ(i).
- the domains of φ_i and Φ_i are identical;
- the set {(i, x, t) ∈ N^3 | Φ_i(x) = t} is recursive.
----------------------------------------------
Note the key things: Blum complexity is a tuple of functions, one of
which maps programs to integers, and the other maps integers to
programs. This is a far, far cry from the usual notion of complexity
(even structural complexity) which attaches a numerical value to an
object.
That said, I don't really understand how their examples (time and
space complexity, which are very different beasts from structural
complexity) fit the definitions given, so maybe there are some
problems with the Wikipedia article.
Blum complexity is really the second component, Φ, in the (φ_i, Φ_i)
pair above. The first components are just the φ_i. The article is
unnecessarily complex (no pun intended). A Blum measure is any
computable predicate on the φ_i and the inputs. The two most used are
time and space; for time, the predicate depends on i (that is, φ_i),
x and t. What needs to be answerable recursively (by yes or no) is
something like: did the program i with input x stop at time t (or
before time t, not after!).
The relation between φ_i and Φ_i is comparable to the relation
between Gödel's undecidable beweisbar predicate (∃y (y is a proof of
x)) and the decidable 2-ary predicate (y is a proof of x). ("y is a
proof ..." abbreviates "y is a number description of a proof ...".)
But Blum's theory works with any decidable predicate, like the number
of times this or that word is used, or the amount of memory
allocated, etc.
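Bruno's point, that the time measure is just a decidable predicate on i, x and t, can be sketched in Python. Everything below (the two toy programs standing in for a real Gödel numbering of P^(1), the one-yield-per-step convention, and the function names) is an illustrative assumption, not Blum's actual construction:

```python
# A minimal sketch of the Blum time measure, assuming a toy two-program
# family in place of a real Goedel numbering.  Each phi_i is a generator
# that yields None once per computation "step" and yields its result as
# the final, non-None value.

def phi_0(x):
    """Doubles x, taking one step per increment plus one halting step."""
    acc = 0
    for _ in range(x):
        acc += 2
        yield None          # one computation step
    yield acc               # halt, producing the result

def phi_1(x):
    """Halves even x in one step; diverges on odd x."""
    while x % 2 == 1:
        yield None          # loops forever on odd input
    yield x // 2

PROGRAMS = [phi_0, phi_1]

def run(i, x, max_steps):
    """Step phi_i(x) for at most max_steps steps.
    Returns (halted, steps_used, value)."""
    steps = 0
    for out in PROGRAMS[i](x):
        steps += 1
        if steps > max_steps:
            return (False, steps, None)   # gave up: not halted yet
        if out is not None:
            return (True, steps, out)     # halted at this step
    return (True, steps, None)            # unreachable for these phi_i

def blum_time_equals(i, x, t):
    """The decidable predicate of Blum's second axiom, PHI_i(x) = t:
    did program i with input x stop at exactly step t?  Decidable even
    when phi_i(x) diverges, because we never run past t steps."""
    halted, steps, _ = run(i, x, t)
    return halted and steps == t
```

Note that blum_time_equals(1, 5, 100) answers False and terminates even though phi_1 diverges on odd input: the yes/no question about step t never requires running past step t, which is exactly the recursiveness the second axiom demands.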
any observer. Also, in the UD, there is no mutation, and no
Darwinian selection.
I already addressed this. Once you admit the presence of a
computationalist observer (a "view from the inside"), you have both
mutation (via FPI)
I can understand, but that is misleading. A mutation is usually
thought of as a change in a program. The UD just generates (and runs)
the two different programs (and also mutations, but at a different
level).
and selection (anthropic selection). The only thing
left is heredity to make up the triumvirate of evolutionary criteria,
and that is covered by the "consistent extension" aspect of future
observer moments, something you have insisted upon in the past.
Yes, the no-cul-de-sac rule, which transforms []p into []p & <>t,
does something like that. We abstract ourselves from all realities
where we do not belong or where we are dead.
Only a measure on the existing computations.
Of course, all this list is based on the idea that the overall
theory should be simple (like RA), but even without a notion of
observers, such a simple theory admits a rich third-person theory of
complexity.
I still don't see it.
You can prove the existence of the observer in the theory RA. It is
the proof of the existence of the universal machine and of its
computations. RA is already sigma_1 complete. If ZF proves the Riemann
hypothesis, then RA will prove that ZF proves the Riemann hypothesis
(even though RA itself cannot prove that 0 + x = x).
We don't need to assume more than RA or the SK axioms. 3p Observers
are defined in such theories. They are richer and more complex than RA.
Bruno
The point is that such complexity can be derived from the elementary
assumptions, and we don't have to invoke physics or evolutionary
biology to get its existence. We might need to invoke evolutionary
biology to justify *our* (human) apprehension of it.
Bruno
--
----------------------------------------------------------------------------
Dr Russell Standish Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Senior Research Fellow [email protected]
Economics, Kingston University http://www.hpcoders.com.au
----------------------------------------------------------------------------
--
You received this message because you are subscribed to the Google
Groups "Everything List" group.
To unsubscribe from this group and stop receiving emails from it,
send an email to [email protected].
To post to this group, send email to [email protected].
Visit this group at https://groups.google.com/group/everything-list.
For more options, visit https://groups.google.com/d/optout.
http://iridia.ulb.ac.be/~marchal/