On 07 Sep 2012, at 13:39, Stephen P. King wrote:
On 9/7/2012 3:14 AM, Bruno Marchal wrote:
But you claim that too, as matter is not primitive. Or you have lost me.
I need matter to communicate with you, but that matter is explained
in comp as a persistent relational entity, so I don't see the
problem. It is necessary in the sense that it is implied by the
comp hypothesis, even constructively (making comp testable). It is
even more stable and "solid" than anything we might extrapolate
from observation, as we might be dreaming. Indeed it comes from the
atemporal, ultra-stable relations between numbers, which you recently
mentioned as not created by man (I am very glad :).
Matter is not primitive because it is not irreducible. My claim is
that matter is, explained very crudely, a pattern of invariances for
some collection of inter-communicating observers (where an observer
can be merely a photon detector that records its states).
OK, except that we have no photon at the start.
This does not contradict your explanation of it as a "persistent
relational entity", but my definition is very explicit about the
requirements that give rise to the "persistent relations". I
believe that these might be second-order relations between
computational streams, and can be defined in terms of bisimulation
relations between streams.
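The idea of comparing computational streams by bisimulation can be sketched as follows. This is only a minimal illustration, not Stephen's actual proposal: streams are modelled as Python generators, and since a true bisimulation is a coinductive relation over the whole infinite behaviour, a bounded comparison like this can refute bisimilarity but can only give finite evidence for it.

```python
from itertools import islice

def bounded_bisimilar(make_s1, make_s2, depth=100):
    """Compare the first `depth` observable outputs of two streams.

    Each argument is a zero-argument callable returning a (possibly
    infinite) iterator. A genuine bisimulation would have to relate
    the entire behaviours; this check only samples a finite prefix.
    """
    return list(islice(make_s1(), depth)) == list(islice(make_s2(), depth))

def ones():
    """The constant stream 1, 1, 1, ..."""
    while True:
        yield 1

def ones_by_counting():
    """A differently written stream with the same observable behaviour."""
    n = 0
    while True:
        yield n + 1 - n  # always 1, computed a roundabout way
        n += 1
```

Here `bounded_bisimilar(ones, ones_by_counting)` holds at any depth, even though the two programs are syntactically quite different, which is the sense in which distinct computations can stand in the same "persistent relation".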
You might try to relate this with the UDA consequences.
I question the very idea of "atemporal ultra-stable relations
between numbers", since numbers cannot be considered consistently as
just the entities that correspond to 0, 1, 2, 3, ... We have to
consider all possible denotations of the signified.
I think this is deeply flawed. The notions of denotation and of a
set of denotations are more complex than the notion of number.
for an explanation. Additionally, there is not just a single type
of number, as there is a dependence on the model of arithmetic that
one is using.
Outside arithmetic. This uses the intuitive notion of numbers; even
second-order arithmetic does. This is explained, through comp, as a
construct.
For example, Robinson Arithmetic and Peano Arithmetic do not define
the same numbers.
Of course they do. RA has more models than PA, but we use the theory
with the intended model in mind, relying on our intuition of numbers,
not on any theory. No one ever interprets a number in the sense of a
non-standard number. That would make comp quite fuzzy. Nobody would
say "yes" to a doctor if he believed that he was a non-standard machine/
number. You can't code them in any finite (in the standard sense!) way.
So we have multiple signifieds and multiple signifiers, and cannot
assume a single mapping scheme between them. I suppose that a
canonical map exists in terms of Tenenbaum's theorem, but I need
to discuss this more with you to resolve my understanding of it.
You do at the basic level what I suspect you of doing in many posts:
escaping forward into complexity. But to get the technical results,
all you need is to accept your intuition of the finite, and things
like the sequence 0, s(0), s(s(0)), etc.
Then, if you agree with the definitions of addition and multiplication,
everything will be OK. If not, you would be like a neuroscientist
trying to define a neuron by the activity of a brain thinking about a
neuron, and you would get a complexity catastrophe.
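The definitions in question, the recursion equations for addition and multiplication over the successor sequence, can be written down directly. A minimal sketch, assuming the standard model (plain Python integers stand in for the standard numerals 0, s(0), s(s(0)), ...):

```python
# The standard numerals, generated by zero and successor.
ZERO = 0

def s(n):
    """Successor: s(n) is the numeral immediately after n."""
    return n + 1

def add(a, b):
    """Addition by recursion on the second argument:
    a + 0 = a;  a + s(b) = s(a + b)."""
    return a if b == ZERO else s(add(a, b - 1))

def mul(a, b):
    """Multiplication by recursion on the second argument:
    a * 0 = 0;  a * s(b) = (a * b) + a."""
    return ZERO if b == ZERO else add(mul(a, b - 1), a)
```

The point of the sketch is that nothing beyond zero, successor, and these two recursion schemes is needed: once one accepts them, `add` and `mul` agree with the intuitive arithmetic on every standard numeral.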
This remark is very important. Your criticism here applies to all
the papers you cite. We have to agree on simple things at the start,
independently of the fact that we can't define them by simpler notions.
For numbers, programs, finite strings, and hereditarily finite
objects, the miracle is that we do share the standard notion of them,
unlike for other notions such as sets, real numbers, etc.
You received this message because you are subscribed to the Google Groups
"Everything List" group.
To post to this group, send email to email@example.com.