I don't know about others, but I am quite disappointed by Symantec's proposed remediation plan. Intentional or not, this response seems to indicate they don't really understand the potential consequences of many of their past actions. Essentially, they promise to:
1) Have a third party audit all of their EV certs
2) Have a third party audit all of their partner certs
3) Have quarterly audits
4) Offer certs with 3-month validity periods
5) Engage better with the CA/Browser Forum and browser programs to help the ecosystem

These steps, while no doubt appealing to Symantec and its customers, do not address the significant relying-party risks introduced by their past actions, including allowing various third parties carte blanche to issue certs chaining to publicly trusted roots without meaningful oversight (Issues L, W, and Y). This is compounded by apparent institutional difficulties with properly identifying the scope of problems and resolving them (e.g. why did SHA-1 Issue H not lead to procedures that would have prevented Issue J; on Issue D, why did the first set of test-cert mis-issuances not catch the March 2016 ones?). Further doubts as to the trustworthiness of Symantec's PKI come from the significant lack of oversight (intentional or not) even in cases where they *knew* there were problems (e.g. Issues P, Q, T, V).

In light of this history (as well as related history for other CAs discussed on this forum in the past), on what basis are relying parties supposed to conclude that "more audits" and a "promise to do better" is a sufficient response? It seems to me the existing Symantec PKI is a mess, and any steps short of complete distrust of all existing roots leave relying parties exposed to significant risk. No one (apparently not even Symantec, given their demonstrated past inability to identify similar issues within their PKI) has a full view of all the past actions (e.g. cross-signs, creation of unconstrained CAs, etc.) under their existing roots; and the scope of the issues here is more serious than other cases that have led to full distrust under Mozilla's program.
The problem, of course, is compatibility (Symantec's own plan essentially says "yes, many bad architectural decisions were made by us and our (mostly large-enterprise) clients, so we are too big to fail and you can't do drastic things"). But this does not absolve Symantec's existing PKI of its 6+ year history of poor decisions and management.

So what about the following: plan a scheduled phasing-out of trust in existing Symantec certificates (the timeline from Google's proposal seems reasonable), but with all certificates chaining to existing roots, and the old roots themselves, distrusted in the final milestone. This may seem more extreme, but with one addition it has some attractive features: it actually reduces compatibility risks (especially to non-browser-facing systems), allows Symantec to rearchitect their public PKI to follow practices that should help avoid complete distrusts in the future, and gives stronger assurances to relying parties.

To deal with the compatibility consequences, during this timeframe, permit Symantec to generate and begin using new root CAs. These roots could/should be unidirectionally cross-signed by one or more of their existing (but to-be-removed) roots, so that they can begin issuing replacement certificates ~immediately for their customers from sub-CAs under the new roots. The plan would then be to strive to have these new roots incorporated in the trust stores by the time of the final milestone above (and given Symantec's public statements about supporting their customers, they would actually do this).
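To make the cross-sign mechanics concrete, here is a minimal sketch using the openssl CLI (all file names, subjects, and the single-level hierarchy are hypothetical illustrations, not Symantec's actual PKI; a real deployment would involve HSMs, path-length constraints, and proper operational controls). It builds an old root, a new root cross-signed by the old one, and shows that a client trusting only the old root still validates certificates issued under the new hierarchy:

```shell
#!/usr/bin/env bash
# Sketch of a unidirectional cross-sign: an old (to-be-distrusted) root signs
# the new root's key, so legacy clients that trust only the old root can still
# build a path to certificates issued under the new hierarchy.
set -e
cd "$(mktemp -d)"

# Extensions for the CA certificates signed below.
printf 'basicConstraints=critical,CA:TRUE\nkeyUsage=critical,keyCertSign,cRLSign\n' > ca.ext

for k in old-root new-root subca leaf; do
  openssl genrsa -out "$k.key" 2048 2>/dev/null
done

# Self-signed roots: the old one sits in legacy trust stores; the new one is
# what root programs would eventually add.
openssl req -x509 -new -key old-root.key -subj "/CN=Example Old Root" -days 365 -out old-root.pem
openssl req -x509 -new -key new-root.key -subj "/CN=Example New Root" -days 365 -out new-root.pem

# Cross-cert: the old root signs the new root's key under the same subject.
openssl req -new -key new-root.key -subj "/CN=Example New Root" -out cross.csr
openssl x509 -req -in cross.csr -CA old-root.pem -CAkey old-root.key \
  -CAcreateserial -days 365 -extfile ca.ext -out new-root-cross.pem

# A sub-CA and a leaf, both issued purely under the new hierarchy.
openssl req -new -key subca.key -subj "/CN=Example New SubCA" -out subca.csr
openssl x509 -req -in subca.csr -CA new-root.pem -CAkey new-root.key \
  -CAcreateserial -days 365 -extfile ca.ext -out subca.pem
openssl req -new -key leaf.key -subj "/CN=www.example.com" -out leaf.csr
openssl x509 -req -in leaf.csr -CA subca.pem -CAkey subca.key \
  -CAcreateserial -days 365 -out leaf.pem

# Legacy client (trusts only the old root): the path runs through the
# cross-cert, so the leaf still verifies.
openssl verify -CAfile old-root.pem -untrusted new-root-cross.pem -untrusted subca.pem leaf.pem

# Updated client (trusts the new root directly): shorter path, old root unused.
openssl verify -CAfile new-root.pem -untrusted subca.pem leaf.pem
```

Servers would serve the chain leaf + sub-CA + cross-cert: updated clients stop at the new root already in their store, while non-updated clients continue up through the cross-cert to the old root. This is what makes the final distrust milestone survivable for anything that can be updated.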
From the perspective of the public web PKI, this "cleans the slate" and allows the various root programs to clearly articulate the requirements under which these new roots operate (e.g. mandatory disclosure and auditing of all sub-CAs and cross-signed roots (and their sub-CAs) *technically capable* (even if administratively constrained, or constrained by technical means not recognized by public web PKI browsers) of issuing browser-trusted certificates, CT logging, validation methods, certificate validity limits, etc.). Since these new roots carry no legacy baggage, any violation of root program policy becomes clear-cut, simplifying monitoring.

If done right, this approach might even help *reduce* compatibility issues. Each server using existing Symantec certificates falls into one of three categories:

1) serves solely non-browser clients (e.g. fixed-firmware devices, apps, ...)
2) serves solely browser clients
3) serves both 1 & 2

#1 should be completely unaffected by the above plan (it continues to use Symantec's old PKI, which is now essentially a large private PKI).

#2 has three sub-categories:

2a: solely browsers managed by enterprise policy. These can be made immune to the above (and continue to use Symantec's old PKI) if the to-be-publicly-distrusted roots can be added as private roots (or an enterprise setting to achieve the same effect) on the clients. Or they pay the one-time effort of moving to certificates from the new Symantec roots. Of course, the former comes at some cost to the security of those users, but crucially it reduces compatibility issues by giving enterprises flexibility in planning their internal certificate changes (vs. the original Google proposal, where the not-distrusted-but-not-trusting-old-certs logic makes this much harder).

2b: solely browsers not managed by enterprise policy. Here the compatibility risk is comparable to the original Google proposal, but without the potential security risks from undisclosed baggage under the old roots.
This plan requires no more work on the admin's part than any other change of certificate (e.g. in the original Google proposal), and the cross-sign of the new roots by the old ensures non-updated clients retain access.

2c: some browsers policy-managed and some unmanaged. An appropriate admixture of 2a/2b.

#3: if the browsers are all managed, this is equivalent to 2a. If some browsers are unmanaged, here is the biggest risk, since it is possible there are non-browser cert pins, etc., that are mutually exclusive with using certs from the new roots to keep trust in new browsers. But this is no greater a risk than in the original Google proposal, and the number of systems in this category should be low (relative to the other categories), since usual architectures would point fixed-function devices at API-stable subdomains rather than at the more frequently changed browser-accessed ones. Further, there are pure server-side solutions that could address these cases if absolutely required (cf. Cloudflare's SHA-1 serving to legacy clients).

I can understand that some previous CAs in the Mozilla program might complain that the above is unfair (why does Symantec get to immediately propose new roots, while we did not), but this just reflects the reality.

Even if you ignore all the above, I don't care which way you slice it: the actions of Symantec on these issues mean they should not be trusted with EV for a long, long time, as their policy seems to have been "what their customer wants, their customer gets."

_______________________________________________
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy