On Sat, 9 Mar 2013 18:23:27 +0100 Philippe Verdy <[email protected]> wrote:
> 2013/3/9 Richard Wordingham <[email protected]>:
> > In a real example of such a font, how would one adjust the position
> > so that U+002E is on the baseline in section numbers but raised in
> > genuine decimal numbers? (This is not an idiosyncratic style.)
>
> In fact I would have even thought about the reverse:
>
> - section numbers delimited/terminated by full stops *above* the
>   baseline (possibly using the MIDDLE DOT, without any other extra
>   spaces, such as "1·1·2· Section Heading"), but
>
> - decimal dots still being on the baseline (3.10, standard English
>   style)

That's *not* what I was taught to do in school. I'm English, not
American. In the right context, what looks like 3.10 can actually mean
3×10 in British English! The proper encoding of this product is yet
another issue.

> Do you mean higher than baseline but still lower than the middle dot?

The middle dot glyph would do fine. However, I understand that one
should not use U+00B7 MIDDLE DOT to represent the raised decimal
point: U+00B7 does not have the right properties. Therefore, I want to
understand how one can use U+002E to encode it and still have it
rendered properly given the right font for, say, mid-20th-century
British English text. It's just conceivable that no such font existed
yesterday, and I strongly suspect I don't have any such font available
at little extra cost.

> What I've seen instead is a locale-neutral decimal separator
> consisting of a small vertical line **hanging** from the baseline,
> which would normally never be confused for something else...

How many of these cannot be encoded in Unicode?

Richard.
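
P.S. For concreteness, I imagine the font-side machinery would be
something like the fontTools sketch below. Everything named in it is
hypothetical (the input font, the choice of stylistic set ss01, and the
glyph names "period" and "period.raised"); it is only meant to show the
kind of contextual substitution a font could carry, not a recipe.

    from fontTools.ttLib import TTFont
    from fontTools.feaLib.builder import addOpenTypeFeaturesFromString

    # OpenType feature code (AFDKO syntax) compiled into the font below.
    FEA = """
    languagesystem DFLT dflt;

    @DIGITS = [zero one two three four five six seven eight nine];

    feature ss01 {
        # Swap the ordinary full stop for a raised variant only when it
        # sits between two digits, i.e. when it is acting as a decimal
        # point. Glyph names here are assumed, not standard.
        sub @DIGITS period' @DIGITS by period.raised;
    } ss01;
    """

    font = TTFont("MidCenturyBritish.ttf")    # hypothetical source font
    addOpenTypeFeaturesFromString(font, FEA)  # build the contextual GSUB rule
    font.save("MidCenturyBritish-RaisedDecimal.ttf")

Of course, a purely contextual rule like this cannot by itself tell the
section number 3.10 from the decimal 3.10; keeping it in an optional
stylistic set at least leaves that choice to the markup or application
rather than to the font.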

