It is built in to lots of places.

The idiocy arises (or should only arise) with the idea of transferring 
"non-character" data between systems - packed-decimal, binary, even, for 
supreme idiocy, floating-point - so that a "field-level" translation of the 
"character" data is required. Then the receiver asks on StackOverflow "how 
do I convert this 'Mainframe' data in C#/Java/PHP/whatever".

So there's piles of... unnecessary and potentially dodgy code... "out there" 
that probably doesn't know about "non-preferred" signs, or the difference 
between a C and an F and its potential importance, and what about that leading 
nybble for an even number of digits? I do like to ask what they think an 
auditor might feel about it... "So, when this important numeric data arrives, 
what do you do with it?" "Well, we change it, in a program".
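For the curious, here's a minimal sketch in Python of what that receiving-end code has to get right (the function name and the scale parameter are my own illustration; the sign-nybble values are the standard packed-decimal ones - A, C, E, F positive, B, D negative, with C, D, F being the "preferred" encodings):

```python
def unpack_comp3(data: bytes, scale: int = 0):
    """Decode an IBM packed-decimal (COMP-3) field.

    Each byte holds two 4-bit BCD digits; the final nybble is the
    sign. A, C, E, F mean positive, B and D mean negative - C, D,
    and F are "preferred", but the others are valid and do occur.
    For an even declared digit count the leading nybble is a pad.
    """
    # Split every byte into its high and low nybbles.
    nybbles = []
    for byte in data:
        nybbles.append(byte >> 4)
        nybbles.append(byte & 0x0F)

    # The last nybble is the sign, including non-preferred values.
    sign_nybble = nybbles.pop()
    if sign_nybble in (0xA, 0xC, 0xE, 0xF):
        sign = 1
    elif sign_nybble in (0xB, 0xD):
        sign = -1
    else:
        raise ValueError(f"invalid sign nybble {sign_nybble:X}")

    # Everything else must be a decimal digit 0-9.
    if any(d > 9 for d in nybbles):
        raise ValueError("invalid BCD digit in packed field")

    value = 0
    for d in nybbles:
        value = value * 10 + d

    # scale = number of implied decimal places (e.g. PIC S9(3)V99).
    return sign * value / (10 ** scale) if scale else sign * value
```

So `b'\x12\x3C'` decodes to 123, `b'\x01\x23\x4D'` to -1234, and a non-preferred positive sign like `b'\x12\x3A'` still decodes to 123 - which is exactly the case the dodgy hand-rolled converters tend to miss.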

I think paying for CPU usage teaches you not to do (so many) dumb things.

Funny thing is, it can even be done "natively" in COBOL. 

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to [email protected] with the message: INFO IBM-MAIN