Otto Stolz wrote:
[...]
> No, I simply wish that they had based their design on correct
> suppositions. A real-world value (physical, monetary, or whatever)
> is not a mere number, but rather the product of a number and a unit
> of measurement. Changing the unit while keeping the number will
> affect the value no less than changing the number while keeping the
> unit. Example:
> - when you change the width of a diskette from 90 mm to 2286 mm,
>   it will no more fit in the drive,
> - likewise, when you change its width from 90 mm to 90",
>   it will no more fit in the drive,
> - both 90 mm, and (approximately) 3½" wide diskettes will fit in
>   the same drive.
>
> A value in a spreadsheet should never inadvertently change.
>
> Now, Works did exactly that: it multiplied every monetary value
> (in Germany by 1,95583) by replacing its unit with a different
> one, [...]
A perfect explanation of the nature of the problem, but a poor diagnosis of its location. Your description makes clear why, in the retail software industry, we *never* use the two elements of the "monetary settings" which define the actual monetary value of a number. But you fail to notice that the problem is not with Works, or with any other application program, but rather with the system-level locale settings themselves. And, unfortunately, all the operating systems and programming languages that I know of suffer from this problem.

There are two wrong design assumptions in the "monetary settings":

1) The <number of decimals> should not be part of the locale at all. The fact that US dollars need two decimals does not depend on the local conventions of any country, but on the fact that there are coins as small as 1/100 of a dollar. There should be a locale-independent list of currencies that specifies this parameter (and perhaps other things).

2) The <currency symbol> does belong in the locale, but each locale should have an *array* of currency symbols, with an entry for each existing currency. Having a single <currency symbol> per locale would be like having one single <month name>! E.g.: a United States locale could have "$" as the symbol for the US dollar and "Mex$" as the symbol for the Mexican peso, while the Mexican locale could have "$EE.UU." for the former and "$" for the latter. However, US dollars remain US dollars in both countries, just like "January" and "enero" remain month #1 in both countries. So it is perfectly OK if 1234 dollars show as "$1,234.00" in the USA and as "1.234,00 $EE.UU." in Mexico, but it is never OK that 1234 dollars become 1234 pesos! (But the opposite is OK, if you do it on my bank account :-) The first sketch below illustrates both points.

This implies, as you argued, that monetary values should contain a currency Id (second sketch below). Storing money as a simple number is like storing dates with no year, or integers with no sign. It also implies that all user interfaces should be changed to allow users to specify the currency Id, in case it is different from the default.

This also makes it desirable that most systems have some form of exchange() API or service, capable of converting monetary amounts from one currency to another (third sketch below). But this opens a new problem: unlike conversion factors between measurement units, exchange rates fluctuate daily, so there should be some mechanism to update them.

I don't agree, however, that the actual formatted string should be stored with the amount. If you change the symbol of the German mark from "DEM" to "DM", it is OK that amounts previously formatted as "DEM 10,50" become "DM 10,50". But if you add a *new* currency Id, say <euro>, and set it as the system default, an existing "DEM 10,50" amount should remain unchanged. The only effect should be that, if you enter a new amount "10.5" without specifying the currency, it will become "EUR 10,50".
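To make points 1) and 2) concrete: Java's java.util.Currency and java.text.NumberFormat already model this split, with the number of decimals attached to the currency itself and the symbol chosen per locale. A minimal demo (the exact symbols printed depend on the locale data shipped with the runtime):

    import java.text.NumberFormat;
    import java.util.Currency;
    import java.util.Locale;

    public class CurrencyDemo {
        public static void main(String[] args) {
            Currency usd = Currency.getInstance("USD");
            Locale mexico = new Locale("es", "MX");

            // Point 1: the number of decimals comes from a
            // locale-independent currency table, not from the locale.
            System.out.println(usd.getDefaultFractionDigits()); // 2

            // Point 2: each locale supplies its own symbol for
            // each currency.
            System.out.println(usd.getSymbol(Locale.US)); // "$"
            System.out.println(usd.getSymbol(mexico));    // e.g. "USD"

            // 1234 US dollars rendered under two locales: the
            // presentation changes, the value does not.
            NumberFormat us = NumberFormat.getCurrencyInstance(Locale.US);
            NumberFormat mx = NumberFormat.getCurrencyInstance(mexico);
            mx.setCurrency(usd); // format dollars, not pesos
            System.out.println(us.format(1234)); // "$1,234.00"
            System.out.println(mx.format(1234)); // e.g. "USD 1,234.00"
        }
    }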
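As for storing the currency Id with the amount, here is a minimal sketch of such a money type. The class name MonetaryAmount is my own invention; java.util.Currency serves as the currency Id. Note that rendering happens on output only, from the stored Id and the user's locale, so the formatted string itself is never stored:

    import java.math.BigDecimal;
    import java.text.NumberFormat;
    import java.util.Currency;
    import java.util.Locale;

    // A monetary value is a number *plus* a unit of measurement;
    // neither part may ever change behind the user's back.
    public final class MonetaryAmount {
        private final BigDecimal number;   // the numeric part only
        private final Currency currency;   // the currency Id (the "unit")

        public MonetaryAmount(BigDecimal number, Currency currency) {
            this.number = number;
            this.currency = currency;
        }

        public BigDecimal getNumber()   { return number; }
        public Currency   getCurrency() { return currency; }

        // Rendering is derived, not stored: change the locale's symbol
        // table and old amounts re-render, but their value is untouched.
        public String format(Locale locale) {
            NumberFormat nf = NumberFormat.getCurrencyInstance(locale);
            nf.setCurrency(currency);
            return nf.format(number);
        }
    }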
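And here is what the exchange() service might look like, built on the MonetaryAmount sketch above. Everything in it (the class name, the euro-pivot rate table, setRate()) is hypothetical; the point is only that the rates are *data* that must be updatable daily, not constants like millimetres per inch:

    import java.math.BigDecimal;
    import java.math.RoundingMode;
    import java.util.Currency;
    import java.util.HashMap;
    import java.util.Map;

    public class ExchangeService {
        // Rates expressed as "euros per unit of currency", kept in an
        // updatable table because, unlike measurement units, they change.
        private final Map<String, BigDecimal> eurPerUnit = new HashMap<>();

        // The update mechanism, e.g. fed by a daily rate feed.
        public void setRate(Currency c, BigDecimal rate) {
            eurPerUnit.put(c.getCurrencyCode(), rate);
        }

        public MonetaryAmount exchange(MonetaryAmount from, Currency to) {
            BigDecimal inEur = from.getNumber()
                .multiply(eurPerUnit.get(from.getCurrency().getCurrencyCode()));
            BigDecimal result = inEur.divide(
                eurPerUnit.get(to.getCurrencyCode()),
                to.getDefaultFractionDigits(), RoundingMode.HALF_EVEN);
            return new MonetaryAmount(result, to);
        }
    }

Seeded with the one rate that *is* fixed (1 EUR = 1,95583 DEM), such a service would convert legacy DEM amounts explicitly, on request, and never silently, as Works did.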
_ Marco
