http://www.contextis.com/documents/33/Exploiting_XML_Digital_Signature_Implementations-HITBKL20131.pdf
By signing XML content, rather than the raw bytes of an XML document, the
W3C were faced with a problem: intermediate XML processors might modify the
document's physical structure without changing its meaning. At this point
you are permitted to start chuckling, privately. An obvious example is text
encodings: as long as the content is the same, there is no reason why an XML
file stored as UTF-8 should not have the same signature value as one stored
as UTF-16. Other changes can alter the physical representation without
affecting the meaning, such as the order of attributes, since the XML
specification does not mandate how a processor should serialize content.
Eyebrows raised.

With this problem in mind the W3C devised the Canonical XML specification,
which defines a series of processing rules that can be applied to parsed XML
content to produce a known canonical binary representation. For example, it
specifies the ordering of attributes and mandates UTF-8 as the only text
encoding. (A small sketch of this appears at the end of this message.)

Summary: we won't specify how you serialize it, only how you serialize it to
validate the signature.

As a result, you have to parse the untrusted message and expose parsing and
canonicalization to the anonymous attack surface before you can determine
that the signature is invalid, assuming you even managed to check that
properly:

https://lists.w3.org/Archives/Public/public-xmlsec/2009Nov/att-0019/Camera-Ready.pdf
http://www.slideshare.net/44Con/the-forgers-artjamesforshaw44con2k13
https://www.owasp.org/images/5/5a/07A_Breaking_XML_Signature_and_Encryption_-_Juraj_Somorovsky.pdf
https://www.usenix.org/system/files/conference/usenixsecurity12/sec12-final91.pdf

Countermeasures:

http://arxiv.org/pdf/1401.7483.pdf

Proposed that the anonymous attack surface be required to do minimal
processing on untrusted input before authentication/authorization. That
means no parsing: nothing more complicated than slicing off a signature and
validating it. (A second sketch at the end of this message illustrates
this.)

Proposed that this not only encourages security in the non-authenticated
case, it also minimizes the work needed to validate the security of the
anonymous attack surface.

Open question: how much flexibility in cipher negotiation or choices and
serialization can be done safely during this stage? Compare OpenSSL.
Considered that flexibility (which requires more complex pre-auth logic)
comes with risk, but chosen carefully it can be minimized.

--
http://www.subspacefield.org/~travis/ | if spammer then j...@subspacefield.org
"Computer crime, the glamor crime of the 1970s, will become in the 1980s one
of the greatest sources of preventable business loss."
John M. Carroll, "Computer Security", first edition cover flap, 1977
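The first sketch mentioned above: a minimal illustration of what
canonicalization buys you, using Python's standard-library canonicalize()
(Python 3.8+; it implements a later revision of C14N than XML-DSig
deployments typically use, but the idea is the same). The toy document,
element names, and values are mine, purely for illustration: two physically
different serializations, one with reordered attributes and a UTF-16
encoding, reduce to the same canonical form and hence the same signature
value.

import io
from xml.etree.ElementTree import canonicalize

# Same logical document, two physical forms: different attribute order,
# different text encoding, one with an XML declaration.
doc_a = '<m b="1" a="2"><body>hi</body></m>'
doc_b = ('<?xml version="1.0" encoding="UTF-16"?>'
         '<m a="2" b="1"><body>hi</body></m>').encode("utf-16")

c14n_a = canonicalize(doc_a)
c14n_b = canonicalize(from_file=io.BytesIO(doc_b))

assert c14n_a == c14n_b      # identical canonical form
print(c14n_a)                # <m a="2" b="1"><body>hi</body></m>

A signature computed over the canonical form verifies for either physical
form, which is the point; the catch is that the verifier has to parse and
canonicalize attacker-supplied XML to get there.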
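The second sketch: a minimal illustration of the proposed countermeasure.
The framing (payload followed by a 32-byte HMAC-SHA256 tag) and the shared
key are my assumptions for the sake of a runnable example, not part of the
proposal; any signature scheme over the raw bytes would do. The
pre-authentication code does nothing but a length check, a slice, and a
constant-time compare; the XML parser only ever sees bytes that have already
been authenticated.

import hashlib
import hmac
from xml.etree import ElementTree

TAG_LEN = 32  # HMAC-SHA256 tag length (an assumption of this sketch)

def verify_then_parse(message: bytes, key: bytes) -> ElementTree.Element:
    """Authenticate the raw bytes first; parse only what passed."""
    if len(message) < TAG_LEN:
        raise ValueError("message too short")
    payload, tag = message[:-TAG_LEN], message[-TAG_LEN:]
    expected = hmac.new(key, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("bad tag")            # rejected with zero parsing done
    return ElementTree.fromstring(payload)     # parser sees only authenticated bytes

# Sender side, for illustration only (hypothetical key and payload):
key = bytes(32)
body = b'<order id="42"><item>widget</item></order>'
message = body + hmac.new(key, body, hashlib.sha256).digest()
print(ElementTree.tostring(verify_then_parse(message, key)))

The particular MAC is beside the point; what matters is that the anonymous
attack surface is small enough to audit exhaustively, and that parsing and
canonicalization happen only on the authenticated side.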