I was exploring what legitimate reasons (or excuses) there could be for
a CA to have serial numbers that happen to be 64 bits long, not
suggesting good ways to generate serial numbers.

The overall context is to propose automated tests to be run against CT
data or other certificates to detect CAs that fail to include the
required 64 bits of entropy in the serial numbers (as per the BRs).

If you have access to a CA implementation, you may want to check
whether it issues short (<= 64 bit) serial numbers and whether it
actually complies with the BR requirement of >= 64 bits of entropy,
just as a precaution.
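
For example, a quick check along these lines (a hypothetical sketch
using the Python cryptography package; the certificate files are
whatever PEM dumps you have at hand):

import sys
from cryptography import x509

# Count how many of the given PEM certificates carry a serial number
# of 64 bits or fewer -- for a CA that puts more than 64 random bits
# into the serial, this should be rare.
short = total = 0
for path in sys.argv[1:]:
    with open(path, "rb") as f:
        cert = x509.load_pem_x509_certificate(f.read())
    total += 1
    if cert.serial_number.bit_length() <= 64:
        short += 1
print(f"{short} of {total} certificates have <= 64-bit serials")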

On 29/12/2017 18:48, Ryan Sleevi wrote:
Or just generate longer serials with random.

Which is much simpler.

On Fri, Dec 29, 2017 at 11:57 AM, Jakob Bohm via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

On 29/12/2017 13:55, Nick Lamb wrote:

On Fri, 29 Dec 2017 07:24:31 +0100
Jakob Bohm via dev-security-policy
<dev-security-policy@lists.mozilla.org> wrote:

3. Or would the elimination in #2 reduce the entropy of such serial
   numbers to slightly less than 64 bits (since there are fewer than
   2**64 allowed values for all but the first such certificate)?


The tremendous size of the numbers involved means that in practice this
makes no difference. A single collision only becomes likely (not
certain, merely likely) over the course of issuing billions of such
certificates.
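
For concreteness, a rough back-of-the-envelope sketch of that claim
(Python, using the standard birthday approximation; not from the
thread):

import math

# Probability of at least one collision among n serials drawn
# uniformly at random from a space of 2**64 values:
# p ~= 1 - exp(-n*(n-1) / (2 * 2**64))
def collision_probability(n, space=2**64):
    return 1.0 - math.exp(-n * (n - 1) / (2.0 * space))

for n in (10**6, 10**8, 5 * 10**9):
    print(f"{n:>13,} certs: p(collision) ~= {collision_probability(n):.2e}")

# A 50% chance of any collision needs roughly
# sqrt(2 * ln(2) * 2**64) ~= 5.1 billion certificates.
print(math.sqrt(2 * math.log(2) * 2**64))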

If I'm right, a decision to append a further byte (say 0x42) to the
serial number any time a collision would otherwise occur would have
the same _visible_ effect as just throwing away the colliding number
and choosing another, i.e. no effect, because collisions don't
actually happen in practice.
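
A rough sketch of the two strategies being compared (hypothetical
code; issued_serials stands in for whatever duplicate check a CA
actually has):

import os

issued_serials = set()  # stand-in for the CA's record of issued serials

def new_serial_retry():
    # Strategy A: throw the colliding value away and draw again.
    while True:
        s = int.from_bytes(os.urandom(8), "big")
        if s not in issued_serials:
            issued_serials.add(s)
            return s

def new_serial_extend():
    # Strategy B: append a further byte (0x42) whenever a collision
    # would otherwise occur.
    s = int.from_bytes(os.urandom(8), "big")
    while s in issued_serials:
        s = (s << 8) | 0x42
    issued_serials.add(s)
    return s

With random 64-bit draws, neither collision branch is ever expected to
run in practice, so the two functions are observationally identical.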

[ In my day job I maintain a system which uses a 64-bit hash of URLs to
index them. We are conscious that by the pigeonhole principle this hash
could sometimes confuse two URLs, and there's a safeguard to detect
that. Despite processing millions of URLs this way every day, for
several years, the safeguard has never triggered outside of unit tests.
Perhaps one day it will. ]

It wouldn't surprise me if some CAs actually don't check #2 at all.
Since collisions are so unlikely with truly random serial numbers it
might well never come up, even if you explicitly looked for it, so that
this "failure" might have no detectable consequence for a smaller CA
even over the course of decades of operation.

So far as we know ISRG / Let's Encrypt are issuing the largest volume
from a single subCA of any CA, but I believe they just use a lot more
than 64 bits, which is a rational choice here to avoid answering tricky
philosophical questions about integers. I would commend this approach
to other CAs wondering how best to comply.

Final thought: The linter should check for at least 64 bits, but it
can't check for true randomness (doing so may in fact be literally
impossible), so anything further should be left to human observers
and/or CA auditors.

Nick.


However, the "random-with length-extension on collision" algorithm has
some other (non-BR) shortcomings:

- It is sailing very close to the edge of compliance for little or no
  good reason.

- If the collision case is ever triggered, chances are high that other
  parts of the CA system have not been tested with serial numbers of
  different lengths from the same issuing CA, thus causing larger
  failures, such as OCSP responders returning incorrect results for the
  colliding certificates.

Another algorithm that would produce occasional <= 64 bit serial
numbers while remaining compliant is:

- Generate 64 random bits.
- Append a 15-bit counter, resulting in a 15- to 79-bit serial number
  (up to 10 bytes).
- Rotate issuing CAs before the counter wraps.

Here a small fraction (1 in 32768 certificates, thus about one per
issuing CA if all 32768 counter values are used) will randomly be 64
bits or shorter, due to the first 15 random bits being zero.
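
A sketch of that construction (hypothetical code, not from any real
CA implementation):

import os

COUNTER_BITS = 15

def make_serial(counter):
    # 64 random bits followed by a 15-bit counter: up to 79 bits total.
    assert 0 <= counter < 2**COUNTER_BITS, "rotate the issuing CA first"
    r = int.from_bytes(os.urandom(8), "big")  # the 64 bits of entropy
    return (r << COUNTER_BITS) | counter

# The serial is 64 bits or shorter exactly when the top 15 random
# bits are zero, i.e. for about 1 in 2**15 = 32768 certificates:
short = sum(make_serial(c % 2**COUNTER_BITS).bit_length() <= 64
            for c in range(200000))
print(short)  # expect around 200000 / 32768 ~= 6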

So

- 63-bit serial numbers are highly suspicious (the algorithm you
  provided would produce them for half the certificates; other
  algorithms much less frequently, if at all).

- 64-bit serial numbers would be cause for concern, but would require
  more thorough analysis, perhaps even a code audit.

- >= 65-bit serial numbers are a lot less likely to be violating the
  BR entropy requirement.

So perhaps:

- Generate an informational warning for 64-bit serial numbers.

- Generate a stronger warning for 33 to 63-bit serial numbers.

- In a separate tool check if an issuing CA produces shorter-than-64-bit
  serial numbers more frequently than a random distribution (with some
  margin).

- One simple test could be to dump the serial numbers from an issuing
  CA as bare unsigned binary numbers (without DER headers, leading
  zeroes, etc.), pipe this into gzip -9 or zlib, and check whether the
  resulting output is less than 8 bytes per certificate (see the
  sketch after this list).

- A more sophisticated whole-CA test could first sort the serials
  numerically, then take pairwise differences of all but the first
  before converting the differences to bare (signed) binary numbers and
  compressing.  This would reduce the false entropy contributed by pure
  counter bits (also sketched below).
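
A sketch of those checks (hypothetical code; serials is assumed to be
a list of one issuing CA's serial numbers as Python integers, e.g.
parsed out of CT data):

import zlib

def to_bytes(n, signed=False):
    # Shortest big-endian encoding of an integer, no DER headers.
    length = (n.bit_length() + 7 + (1 if signed else 0)) // 8 or 1
    return n.to_bytes(length, "big", signed=signed)

def lint_serial(serial):
    # Per-certificate thresholds from the list above (bit lengths).
    n = serial.bit_length()
    if 33 <= n <= 63:
        return "warning: fewer than 64 bits"
    if n == 64:
        return "info: exactly 64 bits, worth a closer look"
    return "ok"

def compressed_bytes_per_serial(serials):
    # Simple whole-CA test: concatenate the bare unsigned values and
    # compress; well under 8 output bytes per certificate suggests
    # less than 64 bits of entropy per serial.
    blob = b"".join(to_bytes(s) for s in serials)
    return len(zlib.compress(blob, 9)) / len(serials)

def compressed_delta_bytes_per_serial(serials):
    # Sorted-differences variant: this strips the false entropy that
    # pure counter bits would otherwise contribute.
    ordered = sorted(serials)
    deltas = [b - a for a, b in zip(ordered, ordered[1:])]
    blob = b"".join(to_bytes(d, signed=True) for d in deltas)
    return len(zlib.compress(blob, 9)) / len(deltas)

Both tests only give a meaningful signal once a CA has issued enough
certificates for the compressor to find patterns; for a handful of
serials the zlib header overhead alone dominates.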





Enjoy

Jakob
--
Jakob Bohm, CIO, Partner, WiseMo A/S.  https://www.wisemo.com
Transformervej 29, 2860 Søborg, Denmark.  Direct +45 31 13 16 10
This public discussion message is non-binding and may contain errors.
WiseMo - Remote Service Management for PCs, Phones and Embedded