Peter Gutmann wrote:
That cuts both ways though.  Since so many systems *do* screw with data (in
insignificant ways, e.g. stripping trailing blanks), anyone who does massage
data in such a way that any trivial change will be detected is going to be
inundated with false positives.  Just ask any OpenPGP implementor about
handling text canonicalisation.

this was one of the big issues in the asn.1 encoding vis-a-vis xml encoding wars.

asn.1 encoding provided deterministic encoding for signed material ... although in some of the more common applications of digital signatures, what is transmitted is the original encoded material along with the signature of that encoded material.

fstc/e-check project wanted to digitally sign stuff that was xml encoded ... but not transmit the xml encoded fields. they wanted to take standard financial transaction fields ... momentarily xml encode the standard fields, digitally sign the encoded material ... and then append the resulting digital signature to the (original) standard transaction for transmission.

the problem was that xml didn't have a deterministic definition for encoding fields. when the recipient/relying party received the transmission ... they had to take the standard transaction fields and re-encode them in xml in order to verify the digital signature. fstc/e-check came up with fsml for deterministic encoding of fields ... so that the encoding done by the originator (of the digital signature) and the encoding done by the relying party (for verifying the digital signature) would produce identical bit patterns.
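the encode-sign-transmit-reencode-verify flow above can be sketched in python ... a minimal illustration only, using canonical json (sorted keys, fixed separators) as a stand-in for fsml's deterministic encoding, and a keyed hash as a stand-in for a real digital signature; all the names and the key are hypothetical:

```python
import hashlib
import json

def canonical_encode(fields):
    # deterministic encoding: sorted keys, fixed separators ... so the
    # originator and the relying party produce identical bit patterns
    # regardless of the order the fields arrive in
    return json.dumps(fields, sort_keys=True, separators=(",", ":")).encode()

def sign(fields, key):
    # stand-in "signature": keyed hash over the (momentary) canonical
    # encoding; the encoding itself is never transmitted
    return hashlib.sha256(key + canonical_encode(fields)).hexdigest()

def verify(fields, key, signature):
    # relying party re-encodes the received standard fields and checks
    # that the re-encoding yields the same signature
    return sign(fields, key) == signature

# originator: sign the transaction, transmit fields + signature only
txn = {"payee": "acme", "amount": "100.00", "currency": "USD"}
sig = sign(txn, b"shared-key")

# relying party: field order may differ, but canonical re-encoding
# reproduces the identical bit pattern, so verification succeeds
received = {"currency": "USD", "amount": "100.00", "payee": "acme"}
print(verify(received, b"shared-key", sig))
```

without the deterministic encoding step (e.g. json.dumps with default key order), the relying party's re-encoding could differ bit-for-bit from the originator's and the signature check would fail ... which was exactly the xml problem fsml addressed.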

fsml was subsequently contributed to the xml digital signature project.

xml is a descendant of gml, invented by "G", "M", and "L" in 1969 at the science center,
and then standardized as sgml at iso

The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]