> In order to rule out a possible DOM document issue, I modified the
> signBase64SignatureDSA function so that the same hash is signed in a
> loop. That eventually produces a different rawLenS as well; the number
> of iterations varies, but eventually a different length is returned,
> which is most probably the reason for the signature verification to
> fail. The actual signing is done by the OpenSSL ecdsa_do_sign function,
> but debugging it would require a better understanding of EC algorithms.

I don't know much about DSA, such as whether a given input and key always 
produce the same output, or whether there's a nonce of some kind involved that 
would change it (it would be embedded in the signature itself, so the verifier 
could recover it). I didn't think so.
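For what it's worth, standard (non-deterministic) ECDSA does draw a fresh random nonce k for every signature, so (r, s) legitimately differs on each signing of the same hash with the same key, while verification still succeeds. A toy sketch over a tiny textbook curve illustrates this; the curve parameters, key, and hash value here are purely illustrative and have nothing to do with the code under discussion:

```python
import random

# Toy curve y^2 = x^3 + 2x + 2 over GF(17); generator G = (5, 1) has order 19.
p, a = 17, 2
G, n = (5, 1), 19

def inv(x, m):
    return pow(x, -1, m)  # modular inverse (Python 3.8+)

def add(P, Q):
    # Elliptic-curve point addition; None represents the point at infinity.
    if P is None: return Q
    if Q is None: return P
    if P[0] == Q[0] and (P[1] + Q[1]) % p == 0:
        return None
    if P == Q:
        lam = (3 * P[0] * P[0] + a) * inv(2 * P[1], p) % p
    else:
        lam = (Q[1] - P[1]) * inv(Q[0] - P[0], p) % p
    x = (lam * lam - P[0] - Q[0]) % p
    return (x, (lam * (P[0] - x) - P[1]) % p)

def mul(k, P):
    # Double-and-add scalar multiplication.
    R = None
    while k:
        if k & 1: R = add(R, P)
        P = add(P, P); k >>= 1
    return R

def sign(d, z):
    while True:
        k = random.randrange(1, n)        # fresh random nonce each call
        r = mul(k, G)[0] % n
        if r == 0: continue
        s = inv(k, n) * (z + r * d) % n
        if s: return (r, s)

def verify(Qpub, z, sig):
    r, s = sig
    w = inv(s, n)
    R = add(mul(z * w % n, G), mul(r * w % n, Qpub))
    return R is not None and R[0] % n == r

d = 7            # private key (illustrative)
Q = mul(d, G)    # public key
z = 11           # "hash", already reduced mod n
sigs = {sign(d, z) for _ in range(20)}
print(len(sigs), all(verify(Q, z, s) for s in sigs))  # several distinct sigs, all valid
```

The point being: distinct (r, s) pairs for the same hash and key are normal, so a varying ECDSA_SIG by itself wouldn't prove an OpenSSL bug, but a varying byte *length* downstream still needs explaining.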

If that is not the case, then I think it would be useful to use that modified 
routine to determine where the change happens. If you can determine whether 
the ECDSA_SIG structure content after signing ever changes for a given hash 
and key, then I think we'd know which routine is causing this. If that's 
supposed to stay constant and doesn't, then we know that OpenSSL has a bug; at 
least I don't see any other conclusion, since I can't make it work if the 
signing operation returns an incorrect result.

Either way, I don't see a bug in the code that runs after that, at least 
nothing that would behave non-deterministically. It just encodes the two 
values as octets and then base64-encodes the result.
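Assuming the raw XMLDSig-style layout described here (r and s concatenated as fixed-width big-endian octet strings, then base64), that post-signing step is indeed fully deterministic; a minimal sketch with illustrative values:

```python
import base64

def encode_sig(r, s, width):
    # Concatenate r || s as fixed-width big-endian octets, then base64.
    # Deterministic: the same (r, s, width) always yields the same output.
    raw = r.to_bytes(width, "big") + s.to_bytes(width, "big")
    return base64.b64encode(raw)

sig = encode_sig(0x1234, 0x5678, 32)
print(len(base64.b64decode(sig)))  # 64: two 32-byte halves, regardless of value
```

Note that this only holds if both halves are padded to a fixed width; feeding it variable-length octet strings would shift the boundary between r and s.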

Presumably this is what's changing:

unsigned int rawLenS = BN_bn2bin(dsa_sig->s, (unsigned char *) &rawSigBuf[rawLen]);

That is, of course, also OpenSSL.
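BN_bn2bin writes the minimal big-endian representation of the BIGNUM, dropping leading zero bytes, so its return value varies with the numeric value of s rather than staying fixed at the field width. A small Python model of that behavior (the function names here are illustrative, not OpenSSL's):

```python
def bn2bin(x):
    # Minimal big-endian encoding, like BN_bn2bin: leading zero bytes dropped.
    return x.to_bytes((x.bit_length() + 7) // 8, "big")

def bn2bin_padded(x, width):
    # Fixed-width encoding: left-padded with zero bytes to the field width.
    return x.to_bytes(width, "big")

s1 = (1 << 255) + 5   # high bit of the top byte set  -> 32 bytes
s2 = (1 << 247) + 5   # top byte happens to be zero   -> 31 bytes
print(len(bn2bin(s1)), len(bn2bin(s2)))          # 32 31
print(len(bn2bin_padded(s2, 32)))                # 32
```

So a shorter rawLenS on some iterations is expected whenever s happens to have one or more leading zero bytes; if the consumer requires fixed-width halves, the caller has to pad.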

Anyway, I think the critical question is: where does the variance happen?

-- Scott


