QuoVadis: Failure to revoke key-compromised certificates within 24 hours
Three certificates were reported as having publicly disclosed private keys, by e-mail to complia...@quovadisglobal.com at 2020-03-20 03:05:14 UTC. The e-mail was received by a QuoVadis server at 2020-03-20 03:05:18 UTC. As of 2020-03-22 05:17:37, OCSP still shows all of these certificates as "Good".

The unrevoked certificates are:

https://crt.sh/?id=2605016622
https://crt.sh/?id=1757153116
https://crt.sh/?id=1432019792

Interestingly, at least one other certificate using the same private key as each of the above certificates, also issued by QuoVadis, is now showing as revoked, suggesting that (a) QuoVadis did indeed consider the private keys compromised, and (b) there are no caching or delayed publishing issues at play here.

- Matt

___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy
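The "same private key" claim above can be verified by comparing SPKI SHA-256 fingerprints, which is also the value crt.sh's `spkisha256=` search matches on. A minimal sketch with openssl; for demonstration it generates a throwaway key and two self-signed certificates over it (stand-ins for the real certificates, which you would instead download from crt.sh):

```shell
# Demonstration: two certificates issued over the same key pair have
# identical SPKI SHA-256 fingerprints.

spki_sha256() {
  # Extract the public key, re-encode it as DER SubjectPublicKeyInfo, hash it.
  openssl x509 -in "$1" -noout -pubkey \
    | openssl pkey -pubin -outform DER 2>/dev/null \
    | openssl dgst -sha256 -r | cut -d' ' -f1
}

# Throwaway key, reused for two certificates (hypothetical filenames).
openssl genpkey -algorithm RSA -pkeyopt rsa_keygen_bits:2048 -out key.pem 2>/dev/null
openssl req -new -x509 -key key.pem -subj "/CN=a.example" -days 1 -out cert-a.pem 2>/dev/null
openssl req -new -x509 -key key.pem -subj "/CN=b.example" -days 1 -out cert-b.pem 2>/dev/null

spki_sha256 cert-a.pem
spki_sha256 cert-b.pem   # identical output: the same key pair sits behind both certs
```

Identical fingerprints mean the certificates are bound to the same key pair, so compromise of one implies compromise of all.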
Digicert: failure to revoke certificate with previously compromised key
Certificate https://crt.sh/?id=2606438724, issued either at 2020-03-21 00:00:00 UTC (going by notBefore) or 2020-03-21 01:56:31 UTC (going by SCTs), is using a private key with SPKI 4310b6bc0841efd7fcec6ba0ed1f36e7a28bf9a707ae7f7771e2cd4b6f31b5af, which was reported to Digicert as compromised on 2020-03-20 02:05:49 UTC (and for which https://crt.sh/?id=1760024320 was revoked for keyCompromise soon after certificate 2606438724 was issued).

As previously discussed on this list, the visible consensus is that, according to the BRs, certificates for which the CA already had evidence of key compromise must be revoked within 24 hours of issuance. That 24 hour period has passed for the above certificate, and thus it would appear that Digicert has failed to abide by the BRs.

- Matt
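One way a CA (or anyone) could catch this class of problem before issuance is to compute the SPKI SHA-256 of the key in an incoming CSR and compare it against previously reported compromised keys. A sketch under assumed inputs: `request.csr` and `compromised-spki.txt` are hypothetical, and the block generates its own throwaway CSR so it is self-contained; a real CA would receive the CSR from the subscriber.

```shell
# Throwaway CSR purely so this sketch runs end-to-end.
openssl req -new -newkey rsa:2048 -nodes -keyout /dev/null \
  -subj "/CN=example.test" -out request.csr 2>/dev/null

# Hash the DER-encoded SubjectPublicKeyInfo of the CSR's key.
spki=$(openssl req -in request.csr -noout -pubkey \
  | openssl pkey -pubin -outform DER 2>/dev/null \
  | openssl dgst -sha256 -r | cut -d' ' -f1)

echo "CSR key SPKI SHA-256: $spki"

# compromised-spki.txt: one lowercase hex SPKI hash per line (hypothetical file).
if grep -qx "$spki" compromised-spki.txt 2>/dev/null; then
  echo "REFUSE: key previously reported compromised"
fi
```

The same hash is what crt.sh indexes, so the blocklist lookup and the public search are directly comparable.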
Re: Paessler (was Re: Let's Encrypt: Failure to revoke key-compromised certificates within 24 hours)
On Sat, Mar 21, 2020 at 07:20:27PM +, Nick Lamb wrote:
> On Sat, 21 Mar 2020 13:40:21 +1100
> Matt Palmer via dev-security-policy wrote:
> > There's also this one, which is another reuse-after-revocation, but
> > the prior history of this key suggests that there's something *far*
> > more interesting going on, given the variety of CAs and domain names
> > it has been used for (and its current residence, on a Taiwanese
> > traffic stats server):
> >
> > https://crt.sh/?spkisha256=69fc5edbd904577629121b09c49b711e201c46213e5b175bbee08a4d1d30b3c7
> >
> > If anyone figures out the story with that last key, I'd be most
> > pleased to hear about it.
>
> Sure. [snip story]

Ha ha! Nice detective work. It was the old wildcard for `*.new-access.net` that threw me for a loop, but I suppose if someone's going to reuse a key, why not reuse one for a wildcard?

Thanks, I can now sleep a little sounder knowing there isn't another Debian-style weak PRNG out there.

- Matt
Paessler (was Re: Let's Encrypt: Failure to revoke key-compromised certificates within 24 hours)
On Sat, 21 Mar 2020 13:40:21 +1100, Matt Palmer via dev-security-policy wrote:
> Oh the facepalm, it burns (probably too much hand sanitizer)... let
> me try that again.

Use soap and water where practical. And, as the BBC comedy TV show "That Mitchell & Webb Look" put it many years ago, "Remain indoors".

> There's also this one, which is another reuse-after-revocation, but
> the prior history of this key suggests that there's something *far*
> more interesting going on, given the variety of CAs and domain names
> it has been used for (and its current residence, on a Taiwanese
> traffic stats server):
>
> https://crt.sh/?spkisha256=69fc5edbd904577629121b09c49b711e201c46213e5b175bbee08a4d1d30b3c7
>
> If anyone figures out the story with that last key, I'd be most
> pleased to hear about it.

Sure. This requires a small degree of insight into how little ordinary people (even, say, IT people) understand about public key cryptography.

These servers are running PRTG, a network monitoring tool from an outfit named Paessler. The software offers a web interface with SSL. PRTG is supplied as Windows software, and I have just installed it on my games PC (hopefully uninstalling it will be easy, because this is no time to go out shopping for a PC) to verify the following:

Rather than mint an RSA key pair and self-signed certificate to bootstrap each install, they just supply a (presumably randomly generated) key and certificate right in the install data. They don't have one of those (often rather archaic but functional) UIs that mints new RSA keys and gives you a CSR for them. Instead it offers either a tool that will convert keys and certificates and install them, or you can just paste the files into the right place and restart the software.

Now, for you or me the provided default RSA key is obviously no use, and you'd mint your own with your preferred tools before requesting a publicly trusted certificate, or indeed using your own in-house CA.
But if you don't know much about this stuff and you find there's a perfectly nice RSA key supplied with the software, it seems natural to use it. Whereupon, of course, your "real" publicly trusted certificate is now for a key which in reality is available to anybody with the insight to guess which software you're using. Oops.

Here's their demo certificate; the associated private key is freely available to download as part of their software, but there's no need for me to paste it here.

-----BEGIN CERTIFICATE-----
MIIDpjCCAo6gAwIBAgIJAMM2JGwQ4/iqMA0GCSqGSIb3DQEBBQUAMEAxHjAcBgNV
BAoTFVBSVEcgRGVtbyBDZXJ0aWZpY2F0ZTEeMBwGA1UEAxMVUFJURyBEZW1vIENl
cnRpZmljYXRlMB4XDTEzMDcwODExMTUwNVoXDTIzMDcwNjExMTUwNVowQDEeMBwG
A1UEChMVUFJURyBEZW1vIENlcnRpZmljYXRlMR4wHAYDVQQDExVQUlRHIERlbW8g
Q2VydGlmaWNhdGUwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQCsR6TJ
IF2cRzoUfElst4CxY3q6vnWzZ0U0wrO6pdrVbrWqVcmofC9bxiLpW8AtlsQ0cVAQ
r64juKbivQV6ggpIrFpYE505VDbu6tvqYR8nY2wtNJZNwKhT0hpBNmgujceaDc/q
ghTIzaZGbtzas7HX1g8bBs81mw0TUI+IJNDAz+tbQM0NPxl/BY0LSuRX7ApUp/jn
veUWXzpBb8BbCriQXPeykQuVXF2oWZ4d5B6X8mxl4GhzjmoQsTr0xGi0pWz1Tc0h
Wkcd0hU633Hw1tjL82j8x5uEwy/nrb3ShMOzKtVpsoFA0TBc5BaIgbQvJpBk0Qd6
cfCxnLPjZQj4+AcFAgMBAAGjgaIwgZ8wHQYDVR0OBBYEFO6ncMKuxL4p7cwozSn1
USIYzEK5MHAGA1UdIwRpMGeAFO6ncMKuxL4p7cwozSn1USIYzEK5oUSkQjBAMR4w
HAYDVQQKExVQUlRHIERlbW8gQ2VydGlmaWNhdGUxHjAcBgNVBAMTFVBSVEcgRGVt
byBDZXJ0aWZpY2F0ZYIJAMM2JGwQ4/iqMAwGA1UdEwQFMAMBAf8wDQYJKoZIhvcN
AQEFBQADggEBAKm6SueZqG7mVSyls2D/kFPoxsh1inctOeQPHbwMAVMCD68KGlJh
kSicHq7bISy0aSioGRZe6rS12bcYRkqtgg0DjQ+ZmtHPBJTgrXIqZW0jHuqN6vyS
d4IDCNQGrQQgQ+uC6V71EDcM6WDULuDygqdvM2D1gc8u2di8Rp3MpKfHAi8n0yRu
00B+01aqce/EA0b0dBPeJciKfB1cAU3CEGoLNVS/F8skumn7Q/kWwbuyjz0Nb66m
3WJOu1yAXPalEdRHQIiXEbnJgT5YrNU1R74CSdOATSKjk6kkWromGH63onF8wSS0
hh/btapuzGY6VPSscqMh3k9ji0+sPdxy3+U=
-----END CERTIFICATE-----

Nick.
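For the "mint your own key with your preferred tools" step Nick describes, the openssl one-liners look something like the following. Filenames and the subject name are illustrative, not anything PRTG-specific:

```shell
# Generate a fresh 2048-bit RSA key locally, so the private key never
# leaves your machine (unlike a key bundled with install media).
openssl genpkey -algorithm RSA -pkeyopt rsa_keygen_bits:2048 -out prtg.key 2>/dev/null

# Build a CSR over that key; send prtg.csr (never prtg.key) to the CA.
openssl req -new -key prtg.key -subj "/CN=monitor.example.test" -out prtg.csr 2>/dev/null

# Sanity-check the CSR's self-signature before submitting it.
openssl req -in prtg.csr -noout -verify
```

The certificate the CA returns, together with `prtg.key`, is then what gets pasted into the software's data directory in place of the vendor demo pair.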
Re: Auditing of CA facilities in lockdown because of an environmental disaster/pandemic
On Friday, March 20, 2020 at 3:55:08 PM UTC-5, Ryan Sleevi wrote:
> On Fri, Mar 20, 2020 at 4:07 PM Kathleen Wilson via dev-security-policy <
> dev-security-policy@lists.mozilla.org> wrote:
>
> > My question: What should "location" mean in the above requirement?
>
> The WebTrust Practitioner Guidance offers a reasonable definition:
> https://www.cpacanada.ca/en/business-and-accounting-resources/audit-and-assurance/overview-of-webtrust-services/practitioner-qualification-and-guidance
>
> CA Processing Locations
> All reports issued should list the city, state/province (if applicable),
> and country of all physical locations used in CA operations. This includes
> data center locations (primary and alternate sites), registration
> authority locations (for registration authority operations performed by
> the CA), and all other locations where general IT and business process
> controls that are relevant to CA operations are performed.
>
> > For example, if a CA happens to have two facilities in the same city
> > that should be audited, how can the audit statement clearly indicate if
> > all of that CA's facilities were audited without providing the exact
> > physical addresses?
>
> We're primarily interested in making sure that the auditor examined /both/
> facilities for the appropriateness of controls. ETSI's lack of rigorous
> methodology leaves a lot to be desired here, but it's not difficult to
> disambiguate by indicating something like
> "Facility 1 in City, State, Country" vs "Facility 2 in City, State, Country"
> or
> "Primary Facility in City, State, Country" vs "Disaster Recovery Facility
> in City, State, Country"
> (adjusted as appropriate)

Shortly before the COVID-19 pandemic, members of the WebTrust Task Force reviewed this guidance and discussed whether our reports were providing too much information, in a publicly available report, about the operations of a CA.

Practitioners have in the past been questioned by CAs as to why such specific information, down to the city and state of a CA's operations, should be disclosed. It is a good point, as certainly not all CAs provide this information freely to all of their employees, let alone to outsiders. This is especially true of the larger and more complex CAs.

For the more complex CAs, I can envision another attachment in the audit report, similar to the thumbprint attachment, that lists the locations in the manner Jeremy suggests: protecting the physical location to some degree, yet giving the users of the report enough information to know what was covered. That could be part of our guidance, which of course is just that - guidance. Having our guidance adjusted in this manner would certainly help drive consistency, which would be helpful to the CABF. I am sure there will still be variations in reports, however, as the guidance is non-authoritative for AICPA and CPA Canada.

As for the term "CA facility", I'd like to get this group's thoughts on what that includes. For instance, while a facility hosting an active HSM with CA private keys is certainly a "CA facility", would you also include in this definition something like a bank safe deposit box that stores a deactivated and encrypted copy of a private key? Would you expect that level of information to be disclosed in an audit report?

___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy