On 26 Feb 2011, at 19:50, Pzomby wrote:

On Feb 21, 9:11 am, Bruno Marchal <marc...@ulb.ac.be> wrote:
On 21 Feb 2011, at 13:26, benjayk wrote:

Bruno Marchal wrote:

On 20 Feb 2011, at 00:39, benjayk wrote:

Bruno Marchal wrote:

Isn't it enough to say everything that we *could* describe
in mathematics exists "in platonia"?

The problem is that we can describe many more things than the ones we
are able to show consistent, so if you allow everything we could describe, you take in too much. If you define Platonia by all consistent things,
you get something inconsistent, due to paradoxes similar to Russell's
paradox, or the St Thomas paradox with omniscience and omnipotence.
Why can inconsistent descriptions not refer to an existing object?
The easy way is to assume that inconsistent descriptions are merely
combinations of symbols that fail to describe anything in particular, and
thus have only the "content" that every utterance has by virtue of being
uttered: there exists ... (something).

So they don't add anything to platonia, because they merely assert the
existence of existence, which leaves platonia as described by
consistent descriptions.

I think the paradox is a linguistic paradox, and it really poses no problem.
Ultimately all descriptions refer to an existing object, but some are too
broad or "explosive" or vague to be of any (formal) use.

I may describe a system that is equal to standard arithmetic but also has
1=2 as an axiom. This makes it practically useless (or so I guess...), but it
may still be interpreted in a way that makes sense. 1=2 may mean
there is 1 object that is 2 objects, so it simply asserts the existence
of the one number "two".

But what is two, if 2 = 1? I can no longer have a clue what you mean.
Two is the successor of one. You obviously know what that means.

So keep this meaning and reconcile it with 2=1.
You might get the meaning "two is the one (number) that is the
successor of one", or "one (number) is the successor of two". In essence it means
2*...=1*..., or 2*X=1*Y.
And it might mean "the successor of one number is the successor of the
successor of one number", or 2+...=1+..., or 2+X=1+Y.

The reason that it is not a good idea to define 2=1 is not that it
expresses something that can't be expressed in standard arithmetic,
but that it makes everything much more confusing and redundant. In
mathematics we want to be as precise as possible, so it is a good rule
to always have to specify which quantity we talk about, so that we
avoid talking about something - that is one thing - that is something -
that is two things - but rather talk about one thing and two things
directly; because it is already clear that two things are a thing.


Bruno Marchal wrote:

Now, just recall that "Platonia" is based on classical logic, where
falsity f, or 0 = 1, entails all propositions. So if you insist on saying
that 0 = 1, I will soon prove that you owe me a billion dollars, and
that you should prepare the check.
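To spell out that classical step (a sketch of the standard argument, using only the identities a * 1 = a and a * 0 = 0): once 0 = 1 is granted, any two numbers a and b are provably equal,

```latex
a \;=\; a \cdot 1 \;=\; a \cdot 0 \;=\; 0 \;=\; b \cdot 0 \;=\; b \cdot 1 \;=\; b .
```

In particular, taking a to be your current debt of 0 dollars and b = 10^9, the check follows.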
You could prove that, but what is really meant by that is another matter.
It may simply mean "I want to play a joke on you".

All statements are open to interpretation; I don't think we can avoid that
entirely. We are usually more interested in the statements that are less
vague, but vague or crazy statements are still valid on some level,
though often on a very boring, because trivial, level; like saying "S afs
fdsLfs" (which is just expressing that something exists).

We formalize things, or make them as formal as possible, when we
search for where we disagree, or when we want to find a mistake. The idea
of making things formal, like in first-order logic, is to be able to
follow a derivation or an argument in a way which does not depend on
any interpretation, other than the procedural inference rules.
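As a toy illustration of that idea (my own encoding, not any standard proof system): a derivation can be checked purely by the shape of the formulas, with no interpretation of the symbols at all. Here a formula is any Python value, and an implication "a -> b" is encoded as the tuple ('->', a, b).

```python
def check_proof(premises, lines):
    """Return True iff every line is a premise or follows from two
    earlier lines by modus ponens: from a and ('->', a, b), infer b.
    The checker never interprets the symbols; it only matches shapes,
    which is the point of a formal derivation."""
    proved = []
    for f in lines:
        from_mp = any(g == ('->', a, f) for g in proved for a in proved)
        if f in premises or from_mp:
            proved.append(f)
        else:
            return False
    return True
```

For example, check_proof(['p', ('->', 'p', 'q')], ['p', ('->', 'p', 'q'), 'q']) accepts the derivation of q, while check_proof(['p'], ['q']) rejects it, whatever 'p' and 'q' are taken to mean.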

Bruno Marchal wrote:

3=7 may mean that there are 3 objects that are 7
objects, which might be interpreted as asserting the existence of (for
example) 7*1, 7*2 and 7*3.

Logicians and mathematicians are more simple-minded than that, and it
does not always help to be understood.
If you allow circles with edges, and triangles with four sides, in
Platonia, we will lose any hope of understanding each other.
I don't think we have to "disallow" circles with edges, and triangles
with four sides; it is enough if we keep in mind that it is useful to use
words in a sense that is commonly understood.

That is why I limit myself for the TOE to natural numbers and their
addition and multiplication.
The reason is that it is enough, by comp, and nobody (except perhaps
some philosophers) has any problem with that.

Yes.  A couple of questions from a philosophical point of view:

Language gives meaning to the numbers, as in their operations,
functions, and units of measurement (kilo, meter, ounce, kelvin, etc.).

I am not sure language gives meaning. Language has meaning, but I think meaning, sense, and reference are more primary. With the mechanist assumption, meaning, sense, and reference will be 'explained' by what the numbers 'think' about them, in the manner of computer science (which can be seen as a branch of number theory).

Numbers alone may symbolize some fundamental describable matter and
forces, but a complete and coherent TOE should include elevated human
consciousness beyond the primitive, which in itself requires a
relatively sophisticated language to give meaning to the numbers and
their operations.

Hmm... You can use numbers to symbolize things, by coding, addresses, etc. But numbers constitute a reality per se, more or less captured (incompletely) by some theories (languages, axioms, proof techniques, ...). In this context, that might be important.
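That coding remark can be made concrete with a small sketch: the Cantor pairing function, a standard bijection from pairs of naturals to naturals (the function names here are my own). A single number can faithfully code two, and by iteration any finite structure, which is how numbers come to "symbolize" other things.

```python
from math import isqrt

def pair(x, y):
    """Cantor pairing: code the pair (x, y) of naturals as one natural."""
    return (x + y) * (x + y + 1) // 2 + y

def unpair(z):
    """Decode z back into the unique pair (x, y) with pair(x, y) == z."""
    # w is the diagonal index: the largest n with n*(n+1)//2 <= z.
    w = (isqrt(8 * z + 1) - 1) // 2
    t = w * (w + 1) // 2
    y = z - t
    return (w - y, y)
```

For instance, pair(2, 3) == 18 and unpair(18) == (2, 3), so the number 18 "is" the pair (2, 3) under this coding.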

Would not any TOE describing the universe appear to require
sophisticated human language using referent nouns (and conjunctions,
adjectives, verbs, etc.) to give meaning to the numbers and their
functions and operations?

With the mechanist assumption, humans and their language will be described by machine operations, which will correspond to a collection of number relations (definable with addition and multiplication). This is not obvious, and relies in great part on the progress of mathematical logic.

You repeatedly refer to “addition and multiplication”.  Is not
multiplication repeated addition, or is there another, separate
principle involved with multiplication?

This is a very technical point. It can be shown that classical first-order logic + addition gives a theory too weak to define multiplication, or even the idea of repeating an operation an arbitrary finite number of times. Likewise, it is possible to make a theory of multiplication alone, and then addition is not definable in it. The pure addition theory is known as Presburger arithmetic, and has been shown complete (it proves all the true sentences *expressible* in its language, thus without the multiplication symbol) and decidable, unlike the usual Robinson or Peano arithmetic, with + and *, which are incomplete and undecidable. Once you have the natural numbers and both addition and multiplication, you already get (Turing) universality, and thus incompleteness and insolubility.
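The "repeated addition" idea can be written as a small illustrative sketch: the loop below computes multiplication using nothing but addition, but the loop itself ("add x, y times") is exactly the kind of iteration that first-order formulas over addition alone, as in Presburger arithmetic, cannot express.

```python
def mul(x, y):
    """Multiplication from addition alone, by iterating y times.
    The iteration is the extra principle: a first-order theory of
    addition can talk about sums, but its formulas cannot say
    'repeat this an arbitrary finite number of times'."""
    acc = 0
    for _ in range(y):
        acc = acc + x  # only addition is ever used
    return acc
```

So multiplication is repeated addition in the computational sense, yet adding it to the language is a genuine extension, strong enough to tip the theory from decidable (Presburger) to Turing universal.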



You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To post to this group, send email to everything-list@googlegroups.com.