I don’t think there’s anything inherently wrong with an approach that uses a fixed prefix, whether of one bit or more, provided that at least 64 bits of entropy are included in the serial prior to encoding to DER.
This means a scheme which guarantees a positive INTEGER will generate *encoded* serials in the range of one bit to sixty-five bits, if the goal is to use the smallest possible amount of entropy. However, as you note, this issue only arises when one uses the absolute minimum. A robust solution is to use 159 bits, the maximum allowed. This helps ensure that, even when encoded, the serial will not exceed 20 bytes, thus avoiding any client interpretation issues regarding whether the 20 bytes mentioned in 5280 are pre-encoding (the intent) or post-encoding (as a few embedded libraries implemented it).

Note, however, that even with 159 bits of entropy, it’s still possible (however unlikely) to end up with a compressed encoding as short as one byte, due to zero folding. Using a one-bit prefix in addition to the sign bit (thus, two fixed bits in the serial) can help ensure that a leading run of zero bits is not folded away when encoding; a sketch of that follows below.
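As a rough illustration of that two-fixed-bit approach, here is a minimal Python sketch. The helper name and the exact bit-twiddling are my own hypothetical choices, not something prescribed by 5280 or the BRs; it just fixes the sign bit to 0 and the following bit to 1, and fills the remaining 158 bits with random data so the DER INTEGER content is always exactly 20 bytes.

    import os

    def generate_serial() -> int:
        # Hypothetical helper: fix the top two bits of a 20-byte value
        # so the DER encoding can never shrink below 20 content bytes.
        raw = bytearray(os.urandom(20))
        raw[0] &= 0x7F  # clear bit 7: sign bit stays 0, INTEGER is positive
        raw[0] |= 0x40  # set bit 6: leading zero bytes can never be folded
        return int.from_bytes(raw, "big")

    serial = generate_serial()
    # 158 bits of entropy plus two fixed bits; always 159 significant bits,
    # i.e. exactly 20 content bytes once DER-encoded.
    assert serial.bit_length() == 159

The trade-off is giving up one bit of entropy (158 instead of 159) in exchange for a fixed-length encoding, which sidesteps both the zero-folding and the pre- vs. post-encoding length questions discussed above.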

