> If we wanted to inconvenience everyone equally, the
> attribute names could be gibberish.
> 
Or binary.  Either way your argument goes full circle: since we're
talking about information on the wire, why not stick to what the end
points process best?

Not that ASN.1 should be defended, but your argument needs its holes
poked :-)

> If
> you're thinking of binary-decimal conversion overheads, one could
> insist upon using octal representation (as tar does), which can be
> converted more quickly than decimal, though that does make it a little
> less human-friendly.
> 
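For what it's worth, the octal point holds up: three bits per digit
means conversion is a shift and an OR per byte, where decimal needs a
multiply.  A minimal sketch of parsing a tar-style octal field
(assuming the usual NUL/space-padded ASCII layout; not a full ustar
header reader):

```python
def parse_octal(field: bytes) -> int:
    """Parse a tar-style octal field: ASCII octal digits,
    padded/terminated with NUL or space.  Each digit carries
    exactly three bits, so the inner step is a shift and an
    OR -- no multiplication as decimal conversion would need."""
    value = 0
    for b in field:
        if b in b" \x00":          # tar pads fields with NUL or space
            continue
        value = (value << 3) | (b - 0x30)
    return value

# A tar header stores the mode 0644 as the ASCII field b"0000644\x00"
print(parse_octal(b"0000644\x00"))   # -> 420, i.e. 0o644
```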
The human-friendly bit is a chimera.  It's data on the wire; it was
never the intent of CCITT (I presume) to treat it as anything else.
Much as you can't detect the magnetic properties of core to decide
what is in memory, you need some sort of translator to inspect the
data on the wire.  In CCITT terms, that isn't a problem as cost was
never a consideration when investing in projects with 30- to 50-year
life spans.  It's the urgency of new technology that forces costs all
the way down and deprives R&D departments of multi-digit budgets.

> There has to be mapping between message format and something like a C
> struct (some internal representation).  ASN.1 attempts to describe
> both, as I understand it, to facilitate conversion between the two.

I'm not an ASN.1 guru, never will be.  But I think ASN.1 is intended
to be the permanent representation, with ephemeral internal formats as
dictated by processing.  I'm not sure where in the ITU-T
recommendations I'd have to look to contradict you, however.

It is perfectly possible for the ITU-T committee to have decided to
intrude where they did not belong.  They certainly mirrored the
computer architectures of the time, with various predefined integer
sizes as well as a variable length alternative, so they are guilty of
parochial thinking.
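For the curious, that variable-length alternative is visible in the
BER encoding of INTEGER: a tag octet, a length octet, then as many
two's-complement octets as the value needs.  A minimal decoder sketch
(short-form lengths only, an assumption made here to keep it small):

```python
def decode_ber_integer(data: bytes) -> int:
    """Decode a BER-encoded ASN.1 INTEGER: tag 0x02, short-form
    length, then big-endian two's-complement contents of whatever
    length the sender chose -- the 'variable length alternative'."""
    if data[0] != 0x02:
        raise ValueError("not an INTEGER tag")
    length = data[1]
    if length & 0x80:
        raise ValueError("long-form lengths not handled in this sketch")
    content = data[2:2 + length]
    return int.from_bytes(content, "big", signed=True)

print(decode_ber_integer(b"\x02\x01\x05"))      # -> 5
print(decode_ber_integer(b"\x02\x02\x01\x00"))  # -> 256
```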

++L
