Hi ACW,

On 2/4/2012 1:53 PM, acw wrote:


One can wonder what is the most "general" theory that we can postulate to explain our existence. Tegmark postulates all of consistent mathematics, whatever that is, but is 'all of consistent mathematics' consistent in itself?

I have read several papers that argue strongly that it cannot be! For instance, see: http://arxiv.org/abs/0904.0342 The fact that there are set theories whose axioms are completely opposed to each other is another strong indication of this.

Schmidhuber postulates something much less, just the UD, but strangely forgets the first-person, or what the implementation substrate of that UD would be (and resorts to a Great Programmer to hand-wave it away).

    I wonder why Schmidhuber held back. Did he fear ridicule?

Before reading the UDA, I used to think that something like Tegmark's solution would be general enough and sufficient, but now I think 'just arithmetic' (or combinators, or lambda calculus, or ...) is sufficient. Why? By the Church-Turing Thesis, these systems possess the same computability power; that is, they can all run the UD.

I agree with this line of reasoning, but I see no upper bound on mathematics, since I take Cantor's results as "real": there is no upper bound on the cardinalities occurring in mathematics. I see this as an implication of the old dictum "Nature explores all possibilities."
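
For concreteness, what "running the UD" amounts to structurally is dovetailing: interleaving the steps of every program so that no program is ever starved. Here is a minimal sketch in Python (the numbered "programs" are toy stand-ins of my own devising, not a real universal machine):

    from itertools import count

    def program(n):
        """A toy stand-in for the n-th program: it just counts up from n."""
        k = n
        while True:
            yield (n, k)
            k += 1

    def dovetail():
        """Stage s: start program s, then run one more step of every
        program started so far. Every step of every program is
        eventually executed, but never "all at once"."""
        running = []
        for stage in count():
            running.append(program(stage))
            for p in running:
                yield next(p)

    dv = dovetail()
    for _ in range(10):
        print(next(dv))   # (0,0) (0,1) (1,1) (0,2) (1,2) (2,2) ...

Note that the dovetailer never finishes anything; it only ever "becomes", which is exactly the aspect I do not want eliminated.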

Now, if we do admit a digital substitution, all that we can experience is already contained within the UD, including the worlds where we find a physical world, with us having a physical body/brain (which exist computationally - but let us not forget the random oracle that comes with 1p indeterminacy).

Not quite. Admitting digital substitution does not necessarily admit pre-specifiability, as is assumed in the definition of the algorithms of universal Turing machines <http://en.wikipedia.org/wiki/Algorithm>; it just assumes that we can substitute functionally equivalent components. Functional equivalence does not free us from the prison of the flesh, it merely frees us from the prison of just one particular body. ;-) This idea goes back to my claim that the "Pre-established harmony <http://en.wikipedia.org/wiki/Pre-established_harmony>" idea of Leibniz is false because it requires the computation of an infinite NP-Complete problem to occur in zero steps. As we know, even given infinite resources, a UTM must take at least one computational step to solve such an NP-Complete problem. My solution to this dilemma is to posit an eternally running process at some primitive level. Bruno seems to identify this with the UD, but I claim that he goes too far and eliminates the "becoming" nature of the process.

If we are machines, then we can only experience a finite amount of information in any finite interval of time; some of this information may be incompressible, due to 1p indeterminacy, so we could experience "reals" in the limit, despite there being only finite computations at any given time. This essentially means that any mathematical object which can be described in Tegmark's "Ultimate Ensemble", and which can contain us, is already part of the 1p experiences of those existing within the UD, and we can regard 1p experiences, as well as the UD* trace, as part of the greater "arithmetical" truth (or of any other theory with equivalent computational power, by the Church-Turing Thesis).

Umm, we have to show that the finiteness of machines is necessary from first principles; we cannot just assume that it is so. I agree that the "arithmetical truth" of the UD may be enough to "force" the 1p to have content, but we still need to account for the appearance of interactions, or histories of interactions (à la Julian Barbour's <http://en.wikipedia.org/wiki/Julian_Barbour> Time Capsule idea). There comes a point, even if only in the limit of infinitely many computations, where we can no longer put off the concurrency problem; we have to deal with interactions. One option is to take the "running of the UD" as a primitive kind of dynamic such that, at our local 1p level, time emerges from it, while notions of forces, fields, etc. emerge from the algebras of interactions between the many distinct 1p.


This is why I think "arithmetic" is as good as any for a neutral foundation, and we cannot really distinguish (from the inside) between these foundations by the CTT.

This does not address the neutrality problem, though. How can the foundation be neutral if it is biased toward a particular structure, even one as elegant as arithmetic? My point is that whatever foundation we take within our ontological theories, it must be neutral with respect to any basis, reference frame, grammar, or other structure that would break its perfect symmetry. Nature does not respect any privileged framing whatsoever, and thus there cannot be a privileged observational stance. This stance toward neutrality may seem unusually strong, but I don't see how it can be any other way; even allowing arithmetic to be a primitive is to allow a bias against non-arithmetical structures, and any bias, however weak, is still a rupture of neutrality.

However, there might be other possible foundations, if you wish to postulate concrete infinities, but even if they existed, how could we tell them apart? That does not seem possible for someone admitting a digital substitution, who has a finite mind (at any finite point in time). If you can show that those other foundations are necessary and that they affect our measure/continuations, or that concrete infinities are involved in the implementation of our brain, it could prove COMP wrong.

The Dualism that follows from the analogy of the Stone duality covers this question. Boolean algebras have a specific kind of topological space as their dual. This is forced, and as such there is a direct and predictable link between the behavior of the logic and the behavior of the dual space. Is it a complete accident that the topological space dual to a Boolean algebra looks like a collection of primitive atoms <http://en.wikipedia.org/wiki/Atomism> in a void? I don't think so! So if the logic that observers are limited to is required to be representable (up to isomorphism) in terms of Boolean algebras, then the physical world that those logical entities have as their 1p must look like "atoms in a void". No wonder our particle physics works so well! There is more to add to this, such as the Pontryagin duality, which expands the class of dual spaces to range from the discrete spaces to the compact spaces, but that is for another conversation. :-)
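
To make the duality concrete: for a finite Boolean algebra, the points of the dual Stone space are exactly its ultrafilters, and each ultrafilter is generated by an atom. A small brute-force sketch in Python, using the powerset algebra on three elements (my own toy example):

    from itertools import combinations

    def powerset(s):
        s = list(s)
        return [frozenset(c) for r in range(len(s) + 1)
                for c in combinations(s, r)]

    U = frozenset({0, 1, 2})   # three "atoms"
    B = powerset(U)            # the Boolean algebra 2^U (8 elements)

    def is_ultrafilter(F):
        if not F or frozenset() in F:
            return False                      # must be a proper filter
        for x in B:
            if (x in F) == ((U - x) in F):
                return False                  # exactly one of x, ~x
        for a in F:
            for b in F:
                if (a & b) not in F:
                    return False              # closed under meet
            for x in B:
                if a <= x and x not in F:
                    return False              # upward closed
        return True

    points = [F for F in powerset(B) if is_ultrafilter(F)]
    print(len(points), [min(F, key=len) for F in points])
    # 3 points, generated by the atoms {0}, {1}, {2}

The dual of a finite Boolean algebra is a finite discrete space - literally isolated points, "atoms in a void"; the compact, totally disconnected Stone spaces proper appear in the infinite case.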

There is another problem with taking a set theory as foundational rather than arithmetic - some set theories have independent axioms, so they can be extended by adding either such an axiom or its negation, and the extensions yield different set-theoretical truths.


I didn't mean to take set theory per se as fundamental; I was thinking of set theory as just a mereology - a schema of sorts - for how we define relations between parts and wholes. But as to your point about set theory, does not the proven existence of non-standard models of arithmetic <http://en.wikipedia.org/wiki/Non-standard_model_of_arithmetic> argue the other way? While the Tennenbaum Theorem <http://en.wikipedia.org/wiki/Tennenbaum%27s_theorem> seems to make standard (à la Peano) arithmetic "special" and "unique", I strongly suspect that this is just an invariance property, similar to the invariance of the speed of light in physics: any logical entity will see its own model of arithmetic as countable and recursive; it cannot see the "constant" that would make it non-standard, as that constant is its fixed point, its "identity" if you will. I do not have any formal description of this latter idea, nor even a proof of it, so please just take it as a conjecture. ;-)
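
For reference, the theorem that makes the standard model "special" in this recursive sense:

    Theorem (Tennenbaum, 1959). If $\mathcal{M} = (M, \oplus, \otimes)$ is a
    countable non-standard model of first-order Peano arithmetic, then
    neither $\oplus$ nor $\otimes$ is a computable operation on $M$.

In other words, only the standard model $\mathbb{N}$ admits a recursive presentation, which is exactly the invariance my conjecture above tries to explain from the inside.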

This doesn't really happen with computation - if there's anything absolute in math, it's computation (different theories about what arithmetic is will result in different things the theory can talk about, but that won't make computation any less absolute).

I strongly suspect that your argument here about the "absoluteness" of computation is a bit too strong, or even misplaced. Restricting information to being binary bits in mappings on the Integers is a harsh regime; no wonder computation is so "well behaved" - any deviation of the bits from the tyranny of the integers is terminated with extreme prejudice! I see computation, in general, as "the transformation of representations", and thus do not see the by-fiat confinement to the integers as beneficial.
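
To illustrate what I mean by "the transformation of representations" (a toy sketch, with names of my own choosing): the same abstract operation can act on quite different carriers, with a translation map making the two agree.

    # Two representations of the natural numbers: Python ints, and
    # unary terms such as "SSS0" (read: S(S(S(0)))).

    def succ_int(n):
        return n + 1

    def succ_unary(t):
        return "S" + t

    def decode(t):
        """Translate a unary term into the integer representation."""
        return t.count("S")

    # The translation commutes with the operation: it does not matter
    # whether we compute first and translate after, or vice versa.
    t = "SS0"
    assert decode(succ_unary(t)) == succ_int(decode(t))   # both are 3

Computation, on this view, lives in the commuting square, not in the integers themselves.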



As a side-note, I don't see why a primitive physical world is necessary: from the 1p, we can only know that we have senses, and from the senses we can infer the existence of the external world.

We have the problem of other minds to deal with! That is why, among other things, we need the physical world, albeit NOT a primitive one; the physical world allows for an "external" differentiation of 1p's that would otherwise be identical by Leibniz's identity of indiscernibles. I am just claiming that the abstract <http://en.wikipedia.org/wiki/Abstract_object> and the concrete <http://en.wikipedia.org/wiki/Concrete_object> are always co-present at any level until we go to the limit of bare neutral existence. At that point any differences that might make a difference vanish, and thus logic and spaces would cease being different-yet-isomorphic. Vaughan Pratt shows how this works in terms of the directions of the arrows of the categorical representations of LOGIC and SPACE: they point in opposite directions, thus if we add them up, their directions and scalars vanish: -> + <- = (see http://upload.wikimedia.org/wikipedia/commons/f/ff/Laws_of_Form_-_double_cross.gif)

Additionally, I see this conjecture as similar to Tegmark's Mathematical Universe Hypothesis <http://en.wikipedia.org/wiki/Mathematical_universe_hypothesis>, except that I do not see how the postulate "/All structures that exist mathematically also exist physically/" implies a mathematical monism, as the wiki article states. If for any structure that exists mathematically there must exist a physical structure, there is the implication of a duality between the mathematical and the physical. This is a different sort of duality from that of Descartes, as it does not assume distinct "substances"; it is a form of dual-aspect theory <http://en.wikipedia.org/wiki/Double-aspect_theory> where the dynamics of each aspect run in opposite directions. Vaughan Pratt explains the idea here: http://boole.stanford.edu/pub/dti.pdf
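
For the curious, Pratt makes the opposite-running arrows precise with Chu spaces. A morphism of Chu spaces is a pair of maps, one carrying points forward and one carrying states backward, linked by an adjointness condition. A minimal sketch in Python (the particular spaces are my own toy examples, not taken from Pratt's paper):

    # A Chu space is (points, states, r) with r(point, state) in {0, 1}.
    # A morphism (f, g): A -> B sends points forward via f and states
    # BACKWARD via g, with r_B(f(a), y) == r_A(a, g(y)) for all a, y.

    def is_chu_transform(A, B, f, g):
        pts_A, sts_A, rA = A
        pts_B, sts_B, rB = B
        return all(rB[(f[a], y)] == rA[(a, g[y])]
                   for a in pts_A for y in sts_B)

    A = ({"a0", "a1"}, {"x0", "x1"},
         {("a0", "x0"): 1, ("a0", "x1"): 0,
          ("a1", "x0"): 0, ("a1", "x1"): 1})
    B = ({"b0", "b1"}, {"y0", "y1"},
         {("b0", "y0"): 1, ("b0", "y1"): 0,
          ("b1", "y0"): 0, ("b1", "y1"): 1})

    f = {"a0": "b0", "a1": "b1"}   # points go A -> B
    g = {"y0": "x0", "y1": "x1"}   # states go B -> A: the opposite arrow

    print(is_chu_transform(A, B, f, g))   # True

The two components of a single morphism run in opposite directions, which is just the "-> + <-" picture above.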


If consciousness is how some (possibly self-referential) arithmetical (or computational) truth feels from the inside, it does not seem impossible that there would be computations representing some physical (just not primitive) world, that that world would contain us and our bodies/brains, and that the existence of such computations would be a theorem in arithmetic.


I agree, but the representation of a thing is not the thing, except in very special cases - such as when we say that the "best" simulation of an object is the object itself <http://www.stephenwolfram.com/publications/articles/physics/85-undecidability/2/text.html>. This takes us into a discussion of questions like: when might the map => the territory, or, by duality, the territory => the map <http://chorasimilarity.wordpress.com/2011/06/21/entering-chora-the-infinitesimal-place/>? This is a subtle and important question! ;-)


Onward!

Stephen
