Hi Maarten,

Thanks for the comments, a few replies below -

Maarten Bodewes wrote:


On Thu, Jul 15, 2010 at 6:57 PM, Sean Mullan <sean.mul...@oracle.com> wrote:

    I would like to try to fix a long-standing XMLDSig issue with the
    current DSA and ECDSA signature bytes format.

    The format of the Signature bytes for these algorithms is an ASN.1
    encoded sequence of the integers r and s:

     SEQUENCE { r INTEGER, s INTEGER }

    Unfortunately, this is not compatible with XMLDSig (and other
    signature formats, such as .NET's), which doesn't ASN.1-encode
    them but simply base64-encodes the raw bytes of r and s
    concatenated (the IEEE P1363 format).
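
To make the two layouts concrete, here is a minimal sketch of the P1363 side, assuming r and s are already in hand as BigIntegers and the field size is known from the curve or DSA subgroup (the DER side is sketched further down):

    import java.math.BigInteger;

    public final class P1363 {
        /** r and s as unsigned, fixed-width, big-endian bytes, simply concatenated. */
        public static byte[] encode(BigInteger r, BigInteger s, int fieldSizeBytes) {
            byte[] out = new byte[2 * fieldSizeBytes];
            putFixedWidth(r, out, 0, fieldSizeBytes);
            putFixedWidth(s, out, fieldSizeBytes, fieldSizeBytes);
            return out;
        }

        // Left-pads with zeros (and drops any leading sign byte) so the value
        // occupies exactly len bytes starting at dst[off].
        static void putFixedWidth(BigInteger v, byte[] dst, int off, int len) {
            byte[] b = v.toByteArray();   // minimal two's complement; may start with 0x00
            int start = Math.max(0, b.length - len);
            int n = b.length - start;
            System.arraycopy(b, start, dst, off + (len - n), n);
        }
    }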


There are more standards that use the P1363 format. Personally, I'm involved with the EAC specification for ePassports and Java. You'll find this kind of signature if you look at the EAC certificates for the inspection systems (and their CAs).

    So, our XMLDSig implementation always has to strip off or decode
    the ASN.1 encoding after calling Signature.sign() when generating
    signatures, and ASN.1-encode the signature bytes before calling
    Signature.verify() when verifying signatures. I could live with this
    until now because it was limited to DSA, which wasn't in wide use.
    But now the same problem comes up with ECDSA.


That is a very well-known situation for me :). I don't remember offhand, though, whether I had to normalize the integers as well (stripping 00h bytes from the front, or adding 00h bytes to reach the correct bit size of the signature elements), or whether r & s were encoded as ASN.1 octet strings.

Yes, your memory is correct.
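
For the record, here is roughly what our XMLDSig code has to do after calling Signature.sign() today. This is a simplified sketch, not the actual implementation, and it only handles the DER shapes a DSA/ECDSA signature can take; the fixed-width step at the end is exactly the normalization you describe.

    import java.math.BigInteger;

    // Parse SEQUENCE { r INTEGER, s INTEGER } and re-encode as r || s.
    public final class DerToP1363 {
        public static byte[] convert(byte[] der, int fieldSizeBytes) {
            int[] pos = {0};
            expect(der, pos, 0x30);                // SEQUENCE tag
            readLength(der, pos);                  // sequence length (unused here)
            BigInteger r = readInteger(der, pos);
            BigInteger s = readInteger(der, pos);
            byte[] out = new byte[2 * fieldSizeBytes];
            put(r, out, 0, fieldSizeBytes);        // strip sign byte / left-pad
            put(s, out, fieldSizeBytes, fieldSizeBytes);
            return out;
        }

        private static BigInteger readInteger(byte[] b, int[] pos) {
            expect(b, pos, 0x02);                  // INTEGER tag
            int len = readLength(b, pos);
            byte[] v = new byte[len];
            System.arraycopy(b, pos[0], v, 0, len);
            pos[0] += len;
            return new BigInteger(v);              // DER INTEGERs are signed
        }

        private static int readLength(byte[] b, int[] pos) {
            int first = b[pos[0]++] & 0xFF;
            if (first < 0x80) return first;        // short form
            int n = first & 0x7F, len = 0;         // long form: next n bytes
            while (n-- > 0) len = (len << 8) | (b[pos[0]++] & 0xFF);
            return len;
        }

        private static void expect(byte[] b, int[] pos, int tag) {
            if ((b[pos[0]++] & 0xFF) != tag)
                throw new IllegalArgumentException("unexpected DER tag");
        }

        private static void put(BigInteger v, byte[] dst, int off, int len) {
            byte[] b = v.toByteArray();
            int start = Math.max(0, b.length - len);
            int n = b.length - start;
            System.arraycopy(b, start, dst, off + (len - n), n);
        }
    }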

    I would really like to clean this up. There seems to be a couple of
    ways we could fix this:

    1. Add new standard signature format strings that identify the
    format, ex:

     SHA1withDSAandP1363
     SHA1withECDSAandP1363
     SHA256withECDSAandP1363
     SHA384withECDSAandP1363
     SHA512withECDSAandP1363

    I like this the best, but one issue with this is that the "and"
    extended format is reserved for MGF functions, ex: MD5withRSAandMGF1
    and this is not a mask generation function. My suggestion is that we
    use a keyword (ex: Format) that clearly distinguishes it from an MGF:

     <digest>with<encryption>and<format>Format

    ex:

     SHA256withECDSAandP1363Format


I second this solution, since the new names would also be usable by other applications. I do have one serious concern, though: hardware providers may not support them. And if HW providers don't support them, you need to work around it. Fortunately, if I'm not mistaken, you can work around this by creating a very simple provider that performs the wrapping/unwrapping of the signature (as it doesn't need to do the actual signing).

Yes, but I believe this is no different from what is done today. The Java provider does the DER encoding/decoding, and the underlying hardware implementation does the signing/verification.
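
For what it's worth, here is a sketch of the kind of thin wrapper you describe: a SignatureSpi that delegates to an existing DER-producing implementation and converts the result on the way out. The DerToP1363 helper and the 32-byte field size (P-256) come from the sketches above and are assumptions for the example; a real provider would also implement the verify path (P1363 back to DER) and register itself with a java.security.Provider.

    import java.security.*;

    public final class P1363SignatureSpi extends SignatureSpi {
        private final Signature inner;

        public P1363SignatureSpi() throws NoSuchAlgorithmException {
            inner = Signature.getInstance("SHA256withECDSA"); // underlying DER impl
        }

        protected void engineInitSign(PrivateKey k) throws InvalidKeyException {
            inner.initSign(k);
        }
        protected void engineInitVerify(PublicKey k) throws InvalidKeyException {
            inner.initVerify(k);
        }
        protected void engineUpdate(byte b) throws SignatureException {
            inner.update(b);
        }
        protected void engineUpdate(byte[] b, int off, int len) throws SignatureException {
            inner.update(b, off, len);
        }
        protected byte[] engineSign() throws SignatureException {
            return DerToP1363.convert(inner.sign(), 32);  // P-256: 32-byte field
        }
        protected boolean engineVerify(byte[] sig) throws SignatureException {
            // Sketch only: would re-encode sig as a DER SEQUENCE and delegate.
            throw new UnsupportedOperationException("verify path omitted in sketch");
        }
        protected void engineSetParameter(String p, Object v) {
            throw new InvalidParameterException("not supported");
        }
        protected Object engineGetParameter(String p) {
            throw new InvalidParameterException("not supported");
        }
    }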

Of course, by now the string build-up of the signature algorithm name is getting really complicated (you could say it is starting to imitate life). In the long run it might be a good idea to replace it with something that can be selected/verified at compile time (e.g., a list of signature parameters). For now, it might be a good idea to define a constant somewhere for these kinds of strings.

Yes, point taken. In practice, though, this should not be too much of an issue. We'll document the new algorithms in the standard algorithms document [1] so developers can cut and paste them into their code.
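
For example, assuming option 1 is adopted with the name proposed above (the algorithm string below is the proposal in this thread, not yet a standard name), and following your suggestion of a constant:

    import java.security.*;

    public final class SigAlgs {
        // Proposed name from this thread; hypothetical until it appears in
        // the standard algorithms document [1].
        public static final String SHA256_ECDSA_P1363 = "SHA256withECDSAandP1363Format";

        // With option 1, the application gets P1363 bytes directly: no DER
        // stripping after sign(), no re-encoding before verify().
        public static byte[] signP1363(PrivateKey key, byte[] data)
                throws GeneralSecurityException {
            Signature sig = Signature.getInstance(SHA256_ECDSA_P1363);
            sig.initSign(key);
            sig.update(data);
            return sig.sign();   // raw r || s, ready for base64 in XMLDSig
        }
    }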

    3. Add a higher level DSA/ECDSA Signer API that returns the r and s
    as BigIntegers and leaves the encoding of those bytes to the
    application.

    This is a very clean solution, but is more of a significant API
    change as it would be introducing a new higher level API for
    generating/validating signatures.


Would that not be a *lower* level API, since it does not do the encoding?

Yes, if you look at it that way. Actually, another possible solution would be to enhance the Signature API to support algorithm-specific signature objects (instead of bytes), but I would be very hesitant to do that just to support this option.
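
Roughly what option 3 might look like; the names here are purely illustrative, not a real or proposed API:

    import java.math.BigInteger;
    import java.security.*;

    // Purely illustrative shape for option 3: a signer that exposes r and s
    // and leaves the encoding (DER, P1363, or otherwise) to the application.
    public interface RawDsaSigner {
        void init(PrivateKey key) throws InvalidKeyException;
        void update(byte[] data, int off, int len) throws SignatureException;
        BigInteger[] sign() throws SignatureException;   // { r, s }
    }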

Of course, in the end we might want to replace the current JCA with one that uses the factory principle and immutable Signer and Verifier classes, but that is an entirely different discussion :)

    4. Do nothing

    Live with it :(


Nah, if you want to go for 1), then go for it. No current code would break; it's a standardized algorithm you are implementing, and other people like me are using it.

Thanks.

--Sean

[1] http://download-llnw.oracle.com/docs/cd/E17409_01/javase/6/docs/technotes/guides/security/StandardNames.html
