G’day Paul,

I cannot speak for other CAs; I can only surmise what another CA that is as 
risk intolerant as we are might do. For us, we will collision test, since there 
is some probability of a collision and the test is the only way to completely 
mitigate that risk.
There is a limitation in our current platform that sets the serialNumber 
bit-size globally; however, we expect a future release will allow this to be 
adjusted per CA. Once that is available, we can use any of the good suggestions 
you have made below to move all our Public Trust offerings to larger entropy 
in serialNumber generation.

However, the following is the wording from Section 7.1 of the latest Baseline 
Requirements:
“Effective September 30, 2016, CAs SHALL generate non-sequential Certificate 
serial numbers greater than zero (0) containing at least 64 bits of output from 
a CSPRNG.”

Unless we are misreading this, it does not say that serialNumbers must contain 
64 bits of entropy as output from a CSPRNG, which appears to be the point you 
and others are making. If that was the intention, then perhaps the BRs should 
be updated accordingly?

We don’t necessarily love our current situation with respect to entropy in 
serialNumbers; we would love to be able to apply some of the solutions you have 
outlined, and we expect to be able to do that in the future. However, we still 
assert that for now, our current implementation of EJBCA is technically 
compliant with the BRs Section 7.1 as they are written. Once an update for 
migration to larger-entropy serialNumbers is available for the platform, we 
will make the adjustment to remove any potential further issues.

Regards,
 

-- 

Scott Rea

On 2/25/19, 1:32 PM, "dev-security-policy on behalf of Paul Kehrer via 
dev-security-policy" <[email protected] on behalf 
of [email protected]> wrote:

    Hi Scott,
    
    Comments inline.
    
    On February 25, 2019 at 4:58:00 PM, Scott Rea via dev-security-policy (
    [email protected]) wrote:
    
    G’day Corey,
    
    To follow up on this thread, we have confirmed with the developers of the
    platform that the approach used to include 64-bit output from a CSPRNG in
    the serialNumber is to generate the required output and then test it to see
    if it can be a valid serialNumber. If it is not a valid serialNumber, it is
    discarded, and a new value is generated. This process is repeated until the
    first valid serialNumber is produced.
    
    This process ensures that 64 bits of output from a CSPRNG is used to
    generate each serialNumber that gets used, and this is compliant with the
    BRs Section 7.1.
    
    This approach (assuming it is accurately described) discards exactly half
    of all values, thus halving the address space. That means there are 63 bits
    of entropy, so I do not agree that this process is compliant with the
    baseline requirements. More generally, RFC 5280 allows up to 20 octets in
    the serial number field so why are you choosing to issue on the lower bound?
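    The generate-and-test loop described above can be sketched as follows.
    This is a hypothetical reconstruction; the actual EJBCA validity rule is
    not given in this thread, so the sketch assumes candidates with the top
    bit set are the ones rejected:

```python
import secrets

def generate_serial(num_bytes: int = 8) -> int:
    """Sketch of the described generate-and-test loop.

    Hypothetical validity rule (an assumption, not EJBCA's actual code):
    reject any candidate whose top bit is set, i.e. any value that would
    be negative when read as a signed integer.
    """
    while True:
        candidate = int.from_bytes(secrets.token_bytes(num_bytes), "big")
        if candidate > 0 and candidate >> (num_bytes * 8 - 1) == 0:
            return candidate

# Every serial that survives the loop has its top bit clear, so only
# 2**63 of the 2**64 possible CSPRNG outputs can ever be emitted:
# roughly 63 bits of entropy, not 64.
```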
    
    
    
    I will also point out that if the returned value is valid as a
    serialNumber, it is further checked to confirm that the value has not been
    used before, since there is obviously a minimal chance of collision in any
    truly random process. In this case the serialNumber value will also be
    discarded and the process repeated.
    
    I don't believe all public CAs do collision detection because many have
    chosen to implement serial generation such that collision is highly
    improbable. For example, a CA may choose to generate a 160-bit value and
    clamp the high bit to zero. This provides 159 bits of entropy, with a
    collision probability of roughly 1 in 2 ** 79.5. Alternately, a CA might
    choose to issue with 80 bits of entropy concatenated with a 64-bit
    nanosecond-resolution timestamp. This provides a 1 in 2 ** 40 collision
    probability for any given nanosecond. As a final example, Let's Encrypt's
    Boulder CA generates a 136-bit random value and prefixes it with an 8-bit
    instance ID:
    
https://github.com/letsencrypt/boulder/blob/a9a0846ee92efa01ef6c6e107d2e69f4ddbea7c0/ca/ca.go#L511-L532
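    Both layouts are straightforward to implement. A minimal sketch, following
    the byte layouts described above rather than any particular CA's code:

```python
import secrets

def clamped_serial() -> int:
    """20 random bytes with the high bit clamped to zero: 159 bits of
    entropy, and the DER INTEGER encoding stays positive within 20 octets."""
    raw = bytearray(secrets.token_bytes(20))
    raw[0] &= 0x7F  # clamp the high bit to zero
    return int.from_bytes(raw, "big")

def prefixed_serial(instance_id: int) -> bytes:
    """Boulder-style layout (sketch only): an 8-bit instance ID followed
    by a 136-bit (17-byte) random value, 18 bytes in total."""
    return instance_id.to_bytes(1, "big") + secrets.token_bytes(17)
```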
    
    1 in 2 ** 79.5 is roughly as probable as a randomly generated number
    successfully passing typical Miller-Rabin primality testing while in
    reality being composite. This is not a risk we worry about when creating
    new root keys.
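    For concreteness, the standard birthday approximation p ≈ n(n-1)/2N puts
    numbers on how remote such a collision is (the one-billion-certificate
    figure below is an illustrative assumption, not a quote from any CA):

```python
from math import log2

def birthday_collision_prob(n_draws: int, space_bits: int) -> float:
    """Birthday approximation: p ~ n(n - 1) / (2 * 2**space_bits)."""
    return n_draws * (n_draws - 1) / 2 ** (space_bits + 1)

# Even after issuing a billion certificates from a 159-bit space, the
# chance of any collision is on the order of 2**-100.
p = birthday_collision_prob(10**9, 159)
print(round(log2(p)))  # roughly -100
```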
    
    
    I think it reasonable to expect that EVERY implementation of compliant CA
    software is doing this post-processing to ensure the intended serialNumber
    has not already been used, and this is not something unique to EJBCA. As
    such, every CA out there will have some process that requires
    post-processing of whatever value is returned, with the possibility of
    having to repeat the process if there is a collision.
    
    
    
    Regards,
    
    
    -- 
    
    Scott Rea
     

Scott Rea | Senior Vice President - Trust Services 
Tel: +971 2 417 1417 | Mob: +971 52 847 5093
[email protected]

The information transmitted, including attachments, is intended only for the 
person(s) or entity to which it is addressed and may contain confidential 
and/or privileged material. Any review, retransmission, dissemination or other 
use of, or taking of any action in reliance upon this information by persons or 
entities other than the intended recipient is prohibited. If you received this 
in error, please contact the sender and destroy any copies of this information.

_______________________________________________
    dev-security-policy mailing list
    [email protected]
    https://lists.mozilla.org/listinfo/dev-security-policy
    
