In a message dated 2001-05-07 6:55:01 Pacific Daylight Time, 
[EMAIL PROTECTED] writes:

>  Current programming languages (C++ and others) have violated
>  what I consider good language design by overloading the same
>  glyphs for totally different uses.  
>  The most obvious is <> for brackets and operators! 

C++ was developed in the early 1980s and was based on C, which in turn was 
developed around 1970, only a few years after 7-bit ASCII had been 
standardized.  C was one of the first languages to take full, or nearly 
full, advantage of 7-bit ASCII.

In subsequent years, the continued scarcity of full ASCII support on some 
platforms forced C to add the trigraph hack, in which three-character 
sequences beginning with "??" stand in for some of the less widely available 
ASCII characters.  For example, "??<" and "??>" stand for the curly brackets 
{ and }, and "??'" stands for ^.  Trigraphs are still supported in even the 
most modern C and C++ compilers, regardless of the platform's character set 
support, so you have to backslash-escape one of the question marks in a 
string literal that contains "??" if you don't want any surprises.

Any programming language that wants to avail itself of the rich set of 
punctuation, brackets, and other symbols found in Unicode must have at least 
the following features:

1.  Commonly used symbols *must* be directly available on virtually all 
Latin-script keyboards, not just by typing convoluted dead-key or 
Alt-sequences.

2.  Symbols must be easy to distinguish from each other, not just in a 
professionally designed font but in ordinary handwriting, to prevent 
confusion.

-Doug Ewell
 Fullerton, California
