At 13:56 -0700 2002/01/16, Scott Raney wrote:
>Natural language has always been a major headache for computer-science
>types specifically because of these kinds of token ambiguities. But I
>wouldn't go so far as to say HT is an example of poor design (well, it
>does have its flaws, but this kind of multi-use token isn't one of
>them). It's just much more English-like than is convenient for
>language implementers ;-)
One should remember that the formulas of mathematics were themselves developed from natural-language expressions. Formulas simply proved less prone to ambiguous interpretation, and easier to parse, because they are more compact. For example, one could say "the function sine of the variable x", but it is more compact to write "sin x", with "x" in italics to indicate a variable and "sin" in upright type to indicate a constant. So in this craze for plain English, one is also skipping over the benefits of formulas and computer languages.

One idea that comes to mind is to invent a traditional computer language, with the usual logical precision, while at the same time ensuring that it has proper English counterparts for its constructs. An editor tool could then translate the English into the computer language, and someone who wants to see the English version could apply the tool to the computer code. I can see it before my very eyes: a magnifying glass or "crystal ball" that one holds over the computer-language version and that displays the English version. One could then use the English version to start developing in the language naturally, but also pass naturally to the more compact computer-language version.
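To make this concrete, here is a minimal sketch in Python (every name in it is invented for illustration): one underlying representation of "the function sine of the variable x", rendered either in the compact notation or as an English phrase, so the "crystal ball" is just a second view of the same program.

# Hypothetical sketch: one expression tree, two renderings.
from dataclasses import dataclass

@dataclass
class Var:
    name: str

@dataclass
class Call:
    func: str          # e.g. "sin"
    arg: "Var | Call"  # the argument expression

def compact(e) -> str:
    """Render in compact, formula-like notation."""
    if isinstance(e, Var):
        return e.name
    return f"{e.func}({compact(e.arg)})"

def english(e) -> str:
    """Render the same expression as an English-like phrase."""
    if isinstance(e, Var):
        return f"the variable {e.name}"
    return f"the function {e.func} of {english(e.arg)}"

expr = Call("sin", Var("x"))
print(compact(expr))  # -> sin(x)
print(english(expr))  # -> the function sin of the variable x

Since both views are generated from the same tree, editing either view and parsing it back could round-trip without loss, which is the property such a tool would need.

Hans Aberg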