On 8/17/2017 1:44 PM, Michael StJohns wrote:
See inline.

On 8/17/2017 11:19 AM, Adam Petcher wrote:


Specifically, these standards have properties related to byte arrays like: "The Curve25519 function was carefully designed to allow all 32-byte strings as Diffie-Hellman public keys."[1]

This statement is actually a problem. Valid keys are in the range 1 to p-1 for the field (with some additional pruning). 32-byte strings (or 256-bit integers) do not map one-to-one into that space; there are canonical keys to which multiple (at least two) 32-byte strings map. (See the pruning and clamping algorithms.) The NIST private key generation for EC private keys mitigates this bias by either (a) repeatedly generating random keys until you get one in the range, or (b) generating a key stream with extra (64) bits and reducing it mod p of the curve.
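For reference, here is a minimal sketch of the RFC 7748 clamping step (decodeScalar25519, section 5) showing why the 32-byte-string-to-scalar mapping is not one-to-one: any two strings that differ only in bits the clamp forces decode to the same scalar. The class and method names are invented for illustration; this is not the code under discussion.

import java.math.BigInteger;
import java.security.SecureRandom;
import java.util.Arrays;

public class ClampDemo {

    // Clamp a 32-byte little-endian scalar as specified in RFC 7748, section 5.
    static byte[] clamp(byte[] k) {
        byte[] c = k.clone();
        c[0]  &= (byte) 0xF8;  // clear the 3 least significant bits
        c[31] &= (byte) 0x7F;  // clear the most significant bit
        c[31] |= (byte) 0x40;  // set the second most significant bit
        return c;
    }

    // Interpret a little-endian byte string as a non-negative integer.
    static BigInteger toInt(byte[] le) {
        byte[] be = new byte[le.length];
        for (int i = 0; i < le.length; i++) {
            be[be.length - 1 - i] = le[i];
        }
        return new BigInteger(1, be);
    }

    public static void main(String[] args) {
        byte[] a = new byte[32];
        new SecureRandom().nextBytes(a);

        byte[] b = a.clone();
        b[0] ^= 0x01;  // differs only in a bit that clamping clears

        System.out.println(Arrays.equals(a, b));                      // false: distinct strings
        System.out.println(toInt(clamp(a)).equals(toInt(clamp(b))));  // true: same scalar
    }
}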

If you are concerned about the distribution of private keys in these standards, then you may want to raise your concerns on the CFRG mailing list (c...@irtf.org). I don't have this concern, so I think we should implement the RFC as written.



RFC 8032 private keys: These are definitely bit strings, and modeling them as integers doesn't make much sense. The only thing ever done with these private keys is to use them as input to a hash function.

Again, no. The actual private key is what you get after step 3 of section 5.1.5: generate a random string of 32 bytes, hash it to help with bad random generators (*sheesh*), and interpret the hash, after pruning, as a little-endian integer.

I'm not sure I understand what you are suggesting. For Ed25519, the initial 32-byte secret is hashed to produce 64 bytes. The first 32 bytes are pruned, interpreted as a little-endian integer, and used to produce the public key. The second 32 bytes are also used in the signing operation, where they contribute to the deterministic nonce. See RFC 8032, section 5.1.6, step 1. Are you suggesting that we should represent all 64 bytes as an integer?
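For concreteness, here is a minimal sketch of the expansion being discussed, following RFC 8032 sections 5.1.5 and 5.1.6 step 1. The class name is invented for illustration; this is not a proposal for an actual API, just the steps from the RFC spelled out.

import java.math.BigInteger;
import java.security.MessageDigest;
import java.security.SecureRandom;
import java.util.Arrays;

public class Ed25519ExpansionSketch {

    public static void main(String[] args) throws Exception {
        // Step 1 of 5.1.5: a 32-byte random secret.
        byte[] secret = new byte[32];
        new SecureRandom().nextBytes(secret);

        // Step 2 of 5.1.5: hash the secret with SHA-512 to get 64 bytes.
        byte[] h = MessageDigest.getInstance("SHA-512").digest(secret);

        // First half: pruned and interpreted as a little-endian integer s;
        // the public key is A = s*B (steps 2-4 of 5.1.5).
        byte[] first = Arrays.copyOfRange(h, 0, 32);
        first[0]  &= (byte) 0xF8;  // clear the 3 lowest bits
        first[31] &= (byte) 0x7F;  // clear the highest bit
        first[31] |= (byte) 0x40;  // set the second-highest bit

        byte[] be = new byte[32];
        for (int i = 0; i < 32; i++) {
            be[31 - i] = first[i];  // little-endian -> big-endian for BigInteger
        }
        BigInteger s = new BigInteger(1, be);

        // Second half: the "prefix" used in signing (5.1.6 step 1) to derive
        // the deterministic nonce r = SHA-512(prefix || message).
        byte[] prefix = Arrays.copyOfRange(h, 32, 64);

        System.out.println("scalar s      = " + s);
        System.out.println("prefix length = " + prefix.length);
    }
}

The question in this thread is which of these intermediate values should be treated as "the private key" and how it should be represented.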
