> assign formatting conventions to any that are sufficiently rare
> not to have defaults.

Defaults: my experience is that it is impossible to rely on defaults.

Getting a nontrivial paper into print is often a juggling act.

"If I call these parameters a,b,c,d then I cannot use d as the
dimension of that vector space. Hmm. Let us call them alpha, beta,
gamma, delta. Hmm, no, delta is already the discrepancy.
Let us take p,q,r,s for the parameters, and d for the dimension.
No, I also need a distance d. OK. Take n for the dimension.
Then n is not the total number of points, let us take N for that.
No, N is the point-block incidence matrix. Take v for the total
number of points, and then the vectors u,v,w can be called x,y,z, ..."

For each concept there are a few common notations, and one shuffles
symbols around in an attempt to give every object a symbol that is
"reasonable" for it, and such that similar objects get similar
symbols (a set X may have elements x, y, z, but preferably not
b, rho, Aleph).

This shuffling usually fails; there are just too few symbols
available, and then one resorts to a prime, a bar, a tilde, bold a,
script a, fraktur a, a sub 1, a sup 1, etc.
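In LaTeX terms (a sketch only: \mathfrak needs the amssymb package,
and the default \mathcal alphabet has capitals only), these variants
of a single letter a are spelled

    $a'$  $\bar{a}$  $\tilde{a}$  $\mathbf{a}$  $\mathcal{A}$
    $\mathfrak{a}$  $a_1$  $a^1$

so a single letter yields half a dozen more distinguishable symbols.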

You see that the choice of representation of an object depends
strongly on what other things occur nearby; markup does not help
here at all, since a mathematician really wants to specify things
in greater detail than one usually does in text markup.

[Often a mathematician needs to know: are 1 and l sufficiently
different so that I can use l as a variable? Is the font such
that I can use both x prime and x sup 1? The idea of markup is
to make semantics independent of rendering, but today
mathematics is still far removed from that goal.]
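One partial remedy (a sketch only; the macro names \dimension and
\npoints are invented here for illustration, not any standard) is to
mark up the concept rather than the glyph, so that a clash can be
resolved by reassigning a single definition:

    \newcommand{\dimension}{n}  % the dimension of the vector space
    \newcommand{\npoints}{v}    % the total number of points
    ...a design on $\npoints$ points in dimension $\dimension$...

Renaming then touches one line instead of the whole paper, though it
answers none of the font questions above.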

Andries
