Re: Intent to Ship: Move Extended Validation Information out of the URL bar

2019-10-09 Thread Eric Mill via dev-security-policy
(apologies to anyone who gets this twice, my first email got sent to some
spam folders, so I took out the example domain I used)

Hi Paul,

Those statements are both hyperbolic representations of others' points of
view.

There are plenty of people who are skeptical about the effectiveness of EV
and its associated UI who nonetheless believe that some sense of
trustworthiness about websites is important. For example, Mozilla
integrates the Safe Browsing system into its applications to protect users
from malicious websites, regardless of whether the connection to that
website was secure.

There are also plenty of people who don't enjoy the sight of certificates
imitating PayPal domains being issued by Let's Encrypt, and may think that
this is a symptom of a larger problem - but still don't agree that
intervention by the CA is the appropriate tool to handle that problem, for
reasons such as the lack of a formal process for adjudicating claims (like
the UDRP for registrars), or a general concern about censorship, or an
observation that malware and phishing sites are often deployed to specific
pages on otherwise-good services and that hostname-level enforcement is a
mismatch for the problem.

Putting the hyperbole aside, the general sentiment behind both of those
statements is consistent, and not something I think is in urgent need of
clarification by Mozilla. Further, there are ongoing efforts to improve
online security and trustworthiness that don't rely on CAs doing anything
at all. Services like Microsoft SmartScreen and Google Safe Browsing are
two examples. The deployment of phishing-resistant WebAuthn-based
authentication is another.
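For readers unfamiliar with how these blocklist services avoid relying on CAs, the core of the Safe Browsing Update API is a hash-prefix check that can be sketched roughly as follows. This is a simplified illustration only: the function names and the tiny in-memory "database" are invented for the example, and real clients canonicalize each URL into many expressions and sync prefix lists incrementally.

```python
import hashlib

PREFIX_LEN = 4  # the Update API distributes short (4-byte) SHA-256 prefixes


def url_prefix(url_expression: str) -> bytes:
    """Hash a canonicalized URL expression and keep only the short prefix."""
    return hashlib.sha256(url_expression.encode()).digest()[:PREFIX_LEN]


# Tiny stand-in for the locally synced prefix database:
local_prefixes = {url_prefix("evil.example/phish/")}


def needs_full_hash_lookup(url_expression: str) -> bool:
    """A local prefix hit only *suggests* a match; the client then asks
    the server for full hashes before warning the user."""
    return url_prefix(url_expression) in local_prefixes


assert needs_full_hash_lookup("evil.example/phish/")
assert not needs_full_hash_lookup("good.example/")
```

The design keeps the full blocklist server-side while letting the common case (no match) be decided locally, without sending every visited URL to the service.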

Not speaking for anyone else here, I personally see it as futile to rely on
user education for nearly any aspect of security. Instead, we should be
designing systems that do the right thing on the user's behalf wherever
possible. This doesn't necessarily have to rely on large benevolent central
services: WebAuthn is an excellent example of a standard that can be
implemented by anyone in software or hardware, without getting any
company's permission, and integrated in the way that makes the most sense
for users of that service or platform. WebAuthn relies on the domain name
to resist phishing, but doesn't rely on the user having to watch to make
sure they are at the right domain name. If someone ends up at a similar
phishing domain and doesn't notice, the WebAuthn-based authenticator
(whether a YubiKey, a smartphone fingerprint reader, a MacBook Touch Bar,
etc.) will notice on the user's behalf that the domain name differs from
the website that the authenticator was originally registered at.
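The domain binding described above can be sketched in a few lines. This is a deliberately simplified illustration, not the actual WebAuthn algorithm: the function names are invented, and real clients and servers also validate the full origin, the challenge, and the authenticator's signature.

```python
import hashlib


def authenticator_allows(registered_rp_id: str, current_host: str) -> bool:
    """The relying party ID recorded at registration must equal, or be a
    registrable suffix of, the host the user is actually visiting."""
    return (current_host == registered_rp_id
            or current_host.endswith("." + registered_rp_id))


def rp_id_hash(rp_id: str) -> bytes:
    """Authenticator responses carry SHA-256(rpId), which the server
    verifies, so a credential bound to one domain is useless on another."""
    return hashlib.sha256(rp_id.encode("ascii")).digest()


# A credential registered at example.com works on its subdomains...
assert authenticator_allows("example.com", "login.example.com")
# ...but a lookalike domain fails the check with no user vigilance needed:
assert not authenticator_allows("example.com", "examp1e.com")
assert rp_id_hash("example.com") != rp_id_hash("examp1e.com")
```

The point of the sketch is that the comparison happens in software, on the user's behalf, every time, which is exactly what visual inspection of the URL bar cannot guarantee.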

That's a very smart model that renders many common classes of attacks
infeasible even for well-resourced attackers, while requiring no
user education about how websites or certificates work. And as people get
used to authenticators that are built into their phone or laptop, and use
the same ceremony to log into websites that they already use to unlock
those devices, no additional education will be needed to benefit from these
advancements.

I'm not trying to argue that WebAuthn alone will save the web, but rather
pointing to it as a fruitful example of where resources are being poured
*instead* of into user education or greater reliance on CAs for ecosystem
monitoring. No one's just sitting back and not caring about phishing:
instead, the ecosystem is responding to the threats as they've observed
them, using a model that is already showing real results in the real world
with real users.


On Tue, Oct 8, 2019 at 2:04 PM Paul Walsh via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

>
> > On Oct 8, 2019, at 4:19 AM, carsten.mueller.gl--- via
> dev-security-policy  wrote:
> >
> >> But the target audience for phishing are uninformed people. People
> which have no idea what a EV cert is. People who don't even blink if the
> English on the phishing page is worse than a 5-year old could produce.
> >>
> >> You cannot base the decision if a EV indication in the browser is
> useful on those people.
> >>
> > The discussions that many users don't even recognize the difference
> between EV/OV/DV certificates is unfortunately true, BUT forced by the
> browsers:
> >
> > When EV certificates were introduced, each browser displayed a green
> address bar including the company name and the country abbreviation of the
> certificate applicant.
> > Gradually the green colouring of the address bar was removed and only
> the company name and country abbreviation were displayed in green.
> > To top it all off, the lock symbol of ALL certificates was displayed in
> green to make the confusion of the users perfect.
> > Google Chrome also removed the green color of the company name.
> >
> > Each browser then had a different display of all certificate types at
> short intervals.
> >
> >
> > In the early days of EV certificates, it was easy for me to tell my
> 

Re: Fwd: Intent to Ship: Move Extended Validation Information out of the URL bar

2019-08-15 Thread Eric Mill via dev-security-policy
I'm told my previous message to this thread was flagged as spam for some of
the recipients. But it did get posted to the Google Group, so for those who
didn't get my previous reply, here it is:

https://groups.google.com/d/msg/mozilla.dev.security.policy/iVCahTyZ7aw/tO3k5ua0AQAJ

On Thu, Aug 15, 2019 at 1:59 PM Doug Beattie via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> So far, all I see is a number of contrived test cases picking apart small
> components of EV, and no real data to back it up.  Mostly academic or
> irrelevant research, imho.  Here are a couple of links posted in this
> thread:
>
>
>
> https://www.typewritten.net/writer/ev-phishing/: This post is intended
> for a technical audience interested in how an EV SSL certificate can be
> used as an effective phishing device
>
>
>
> https://stripe.ian.sh/: EV certificates with colliding entity names can
> be generated, but to date, I don’t know of any real attacks, just this
> academic exercise. And how much did it cost, and how long did it take Ian,
> to get certificates to perform this experiment?  Way more time and money
> than a phisher would invest.
>
>
>
>
> https://chromium.googlesource.com/chromium/src/+/HEAD/docs/security/ev-to-page-info.md
> references a number of studies. But none of them indicated that EV was bad
> or misleading or was a detriment to security, and a number of the
> references weren’t even related to EV (including irrelevant research links
> to bolster their claims to the uninformed)
>
>
>
> I haven’t been counting the number of pro and cons emails, but there are a
> significant number of organizations questioning the changes by Google and
> Mozilla.  Mozilla and Google should reconsider their proposed changes.
>
>
>
> Yes, I work for a CA that issues EV certificates, but if there was no
> value in them, then our customers would certainly not be paying extra for
> them.  Shouldn’t the large enterprises that see a value in identity (as
> does GlobalSign) drive the need for ending EV certificates?  With Google
> and Mozilla being prominent Let's Encrypt sponsors we know their intent is
> to drive business to them vs. any of the commercially respectable CAs.
> It’s actually counter productive to security to sponsor a CA that issues so
> many certificates to phishing and malware sites without any consequences.
> Is this to increase the value of their malware site detection services?
> Maybe..
>
> *   https://www.usenix.org/system/files/soups2019-drury.pdf
> *
> https://cabforum.org/wp-content/uploads/23.-Update-on-London-Protocol.pdf
>
>
>
> Baffled…
>
>
>
>
>
>
>
> From: Tom Ritter 
> Sent: Thursday, August 15, 2019 1:13 PM
> To: Doug Beattie 
> Cc: Peter Gutmann ; MozPol <
> mozilla-dev-security-pol...@lists.mozilla.org>
> Subject: Re: Fwd: Intent to Ship: Move Extended Validation Information out
> of the URL bar
>
>
>
>
>
> On Thu, Aug 15, 2019, 7:46 AM Doug Beattie via dev-security-policy <
> dev-security-policy@lists.mozilla.org> wrote:
>
> Peter,
>
> Do you have any empirical data to backup the claims that there is no
> benefit
> from EV certificates?  From the reports I've seen, the percentage of
> phishing and malware sites that use EV is drastically lower than DV (which
> are used to protect the cesspool of websites).
>
>
>
> I don't doubt that at all. However see the first email in this thread
> citing research showing that users don't notice the difference.
>
>
>
>
>
> ___
> dev-security-policy mailing list
> dev-security-policy@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-security-policy
>


-- 
Eric Mill
617-314-0966 | konklone.com | @konklone 
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: Fwd: Intent to Ship: Move Extended Validation Information out of the URL bar

2019-08-15 Thread Eric Mill via dev-security-policy
On Thu, Aug 15, 2019 at 1:59 PM Doug Beattie via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> So far, all I see is a number of contrived test cases picking apart small
> components of EV, and no real data to back it up.  Mostly academic or
> irrelevant research, imho.


(posting in my personal capacity)

I don't think it's accurate to characterize the research dismissively as
academic or irrelevant. I also want to point out up top that Safari
announced it was removing the EV indicator over a year ago, in June 2018.


> https://stripe.ian.sh/: EV certificates with colliding entity names can
> be generated, but to date, I don’t know of any real attacks, just this
> academic exercise. And how much did it cost, and how long did it take Ian,
> to get certificates to perform this experiment?  Way more time and money
> than a phisher would invest.
>

Ian states this directly in the post. It is a trivial amount of money and
time:

"One question may be how practical this attack is for a real attacker who
desires to phish someone. First, from incorporation to issuance of the EV
certificate, I spent less than an hour of my time and about $177. $100 of
this was to incorporate the company, and $77 was for the certificate. It
took about 48 hours from incorporation to the issuance of the certificate."


CAs should be careful about casually and dramatically overestimating the
roadblocks that EV certificates present to attackers.

Even if Ian's experiment took 10 times as long in practice, and cost $1000
over a fortnight, this is well within what we should generally expect
attackers to spend on an organized phishing attack. I have been on the
receiving end, as a website owner whose service was spoofed, of
sophisticated phishing attacks, and I've observed attackers who are willing
to spend substantially more than that for what is by all evidence a
lucrative and often successful class of attack.

https://chromium.googlesource.com/chromium/src/+/HEAD/docs/security/ev-to-page-info.md
> references a number of studies. But none of them indicated that EV was bad
> or misleading or was a detriment to security, and a number of the
> references weren’t even related to EV (including irrelevant research links
> to bolster their claims to the uninformed)
>

The burden is not on the web browsers to prove that EV is detrimental to
security - the burden is on third parties to prove that EV is beneficial.
The fact that it's been around for a long time is not sufficient. I don't
see any evidence that any of the links or resources on that page are
designed to mislead uninformed readers.


I haven’t been counting the number of pro and cons emails, but there are a
> significant number of organizations questioning the changes by Google and
> Mozilla.  Mozilla and Google should reconsider their proposed changes.
>

I don't observe a significant number of organizations questioning these
changes, in this thread or externally, other than CAs. Not that there
aren't any, but I'm not seeing a significant hue and cry in the broader
ecosystem.

I certainly can't speak for the US government, but I can say that when I
worked for the executive branch for a federal agency, I observed a strong
trend in adopting DV certificates (typically automated) throughout the
executive branch. One of the more relevant changes I observed agencies make
was the Department of Defense explicitly updating their internal policies
to remove a requirement to use EV certificates for public properties.
Multiple federal agencies gave internal guidance to widely adopt DV
certificates internally, and you can see a public example of that in the
official guidance accompanying the White House's HTTPS directive at
https://https.cio.gov/certificates/#what-kind-of-certificate-should-i-get-for-my-domain
 -

“Domain Validation” (DV) certificates are usually less expensive and more
amenable to automation than “Extended Validation” (EV) certificates. EV
certificates generally result in the domain owner’s name appearing in the
browser URL bar visitors see. Ordinary DV certificates are completely
acceptable for government use.


Given that Safari already removed the EV indicator well over a year ago, I
expect the guidance will be updated so as not to mislead agencies that EV
will continue to generally show their organization's name in browsers.

You can certainly still find EV certificates on some federal agency
websites out there, but overall, the trajectory away from them has been
clear and accelerating for years.


Yes, I work for a CA that issues EV certificates, but if there was no value
> in them, then our customers would certainly not be paying extra for them.


This is definitely not a strong argument. Enterprises do all sorts of
things they believe may be valuable, based on gut feelings or on outdated
best practices.

For example, 5 years ago, it was still conventional wisdom to periodically
rotate user passwords. After years of empirical research demonstrating the

Re: Odp.: Odp.: Odp.: 46 Certificates issued with BR violations (KIR)

2019-02-02 Thread Eric Mill via dev-security-policy
The BRs and Mozilla program policies don't support the idea of just
trusting a CA to issue certs for "internal" use or to keep them secret.
This is why CAs issuing "test certificates" on production CAs for domains
they don't own is clearly forbidden.

Given that, I don't see how it can be acceptable to provide an "unknown"
status over OCSP for a revoked certificate, on the premise that the CA
asserts they never actually shipped the cert to a customer.

The fact that they would have to mark the cert "valid" before marking it
"revoked" is a limitation of the implementation of the OCSP responder. It's
not a reason to ignore policy that is grounded in the very reasonable
desire to ensure that the certificate's revoked status is known to any
client which checks OCSP instead of CRL.
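The practical consequence for OCSP-only clients can be sketched as follows. This is an illustrative model, not real browser code: the function and enum names are invented, and real clients layer in stapling, caching, and soft-fail policies.

```python
from enum import Enum
from typing import Optional


class OCSPStatus(Enum):
    GOOD = "good"
    REVOKED = "revoked"
    UNKNOWN = "unknown"


def client_decision(ocsp: OCSPStatus, crl_revoked: Optional[bool]) -> str:
    """Per RFC 6960, 'unknown' only says the responder has no knowledge
    of the certificate, so an OCSP-only client never learns that the
    certificate was in fact revoked."""
    if ocsp is OCSPStatus.REVOKED:
        return "reject: revoked"
    if ocsp is OCSPStatus.GOOD:
        return "accept"
    # 'unknown': a client may fall back to the CRL, but a client that
    # checks only OCSP errors out without ever seeing the revocation
    # (Firefox surfaces this as SEC_ERROR_OCSP_UNKNOWN_CERT).
    if crl_revoked:
        return "reject: revoked (via CRL fallback)"
    return "error: status unknown"


# The scenario at hand: the CRL says revoked, but an OCSP-only client sees:
print(client_decision(OCSPStatus.UNKNOWN, None))  # error: status unknown
```

An OCSP-only client thus gets a generic error rather than the revoked status the CRL already records, which is the inconsistency at issue.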

-- Eric

On Sat, Feb 2, 2019 at 4:31 AM Buschart, Rufus via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> Personally I think it would be better, if the revoke reason "Certificate
> hold" on the CRL would be allowed for TLS certificates, as this state would
> exactly cover the described scenario. The OCSP responder could in such a
> case reply with "bad" and deliver the reason "certificate hold". But I
> fully understand that browser developers had a lot of issues with this
> state, so it is still forbidden.
>
> With best regards,
> Rufus Buschart
>
> Siemens AG
> Information Technology
> Human Resources
> PKI / Trustcenter
> GS IT HR 7 4
> Hugo-Junkers-Str. 9
> 90411 Nuernberg, Germany
> Tel.: +49 1522 2894134
> mailto:rufus.busch...@siemens.com
> www.twitter.com/siemens
>
> www.siemens.com/ingenuityforlife
>
> Siemens Aktiengesellschaft: Chairman of the Supervisory Board: Jim
> Hagemann Snabe; Managing Board: Joe Kaeser, Chairman, President and Chief
> Executive Officer; Roland Busch, Lisa Davis, Klaus Helmrich, Janina Kugel,
> Cedrik Neike, Michael Sen, Ralf P. Thomas; Registered offices: Berlin and
> Munich, Germany; Commercial registries: Berlin Charlottenburg, HRB 12300,
> Munich, HRB 6684; WEEE-Reg.-No. DE 23691322
>
> > -Ursprüngliche Nachricht-
> > Von: dev-security-policy 
> Im Auftrag von Kurt Roeckx via dev-security-policy
> > Gesendet: Freitag, 1. Februar 2019 23:38
> > An: Wayne Thayer 
> > Cc: mozilla-dev-security-policy <
> mozilla-dev-security-pol...@lists.mozilla.org>
> > Betreff: Re: Odp.: Odp.: Odp.: 46 Certificates issued with BR violations
> (KIR)
> >
> > On Fri, Feb 01, 2019 at 03:02:17PM -0700, Wayne Thayer wrote:
> > > It was pointed out to me that the OCSP status of the misissued
> > > certificate that is valid for over 5 years is still "unknown" despite
> > > having been revoked a week ago. I asked KIR about this in the bug [1]
> > > and am surprised by their response:
> > >
> > > This certificate is revoked on CRL. Because the certificate has been
> > > never
> > > > received by the customer its status on OCSP is "unknown". To make
> > > > the certificate "revoked" on OCSP first we should make it "valid"
> > > > what makes no sense. I know there is inconsistency between CRL and
> > > > OCSP but there are some scenarios when it can be insecure to make it
> > > > valid just in order to make it revoked.
> > > >
> > >
> > > Upon further questioning KIR states:
> > >
> > > Of course I can mark it as revoked after I make it valid, but I think
> > > it is
> > > > more secure practice not to change its status at all when the
> > > > certificate is not received by the customer. Let's suppose the
> > > > scenario when your CA generate certificate and the customer wants
> > > > you to deliver it to its office. What OCSP status the certificate
> > > > should have when you are on your way to the customer office? valid -
> > > > I do not think so. When the certificate is stolen you are in
> > > > trouble. So the only option is "unknown" but then we have different
> > > > statuses on CRL and OCSP - but we are still safe. It is not only my
> opinion, we had a big discuss with our auditors about that.
> > > >
> > >
> > > Does anyone other than KIR and their auditor (Ernst & Young) think
> > > this is currently permitted? At the very least, I believe that
> returning "unknown"
> > > for a revoked certificate is misleading to Firefox users who will
> > > receive the "SEC_ERROR_OCSP_UNKNOWN_CERT" error instead of
> > > "SEC_ERROR_REVOKED_CERTIFICATE".
> > >
> > > Does anyone other than KIR and Ernst & Young believe that this meets
> > > WebTrust for CAs control 6.8.12? [2]
> >
> > If you follow the RFC, the "unknown" answer can mean that it doesn't
> know, and that an other option like a CRL can be tried.
> > With "unknown", it doesn't say anything about being valid or not.
> >
> > I don't think that interpretation is very useful. I think that the OCSP
> server should know about the certificate before the customer has
> > the certificate. I think that if you have a properly signed certificate
> within its validity period, the OCSP should always return either
> > "good" or "revoked", never 

Re: misissued.com FYI

2019-01-28 Thread Eric Mill via dev-security-policy
Would you consider tossing the backup in a zip file in an S3 bucket or
something, and sharing a link for the record here, for others finding this
in the future?

On Mon, Jan 28, 2019 at 10:05 AM Alex Gaynor via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> Hi All,
>
> For anyone using https://misissued.com/ I wanted to provide a quick FYI
> about some database maintenance. The database was nearing its storage
> capacity limit, and so I deleted all certificates from it that had expired
> before 2019. The main consequence of this is that you can't use
> misissued.com as a complete historical record anymore.
>
> I captured a database backup before doing this, so if anyone does want that
> data, it hasn't been completely lost.
>
> Cheers,
> Alex
> ___
> dev-security-policy mailing list
> dev-security-policy@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-security-policy
>


-- 
Eric Mill
617-314-0966 | konklone.com | @konklone 


Re: CA disclosure of revocations that exceed 5 days [Was: Re: Incident report D-TRUST: syntax error in one tls certificate]

2018-12-05 Thread Eric Mill via dev-security-policy
On Wed, Dec 5, 2018 at 2:36 AM Fotis Loukos via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> On 4/12/18 8:30 μ.μ., Ryan Sleevi via dev-security-policy wrote:
> > On Tue, Dec 4, 2018 at 5:02 AM Fotis Loukos <
> me+mozdevsecpol...@fotisl.com>
>
> As far as I can tell, if no quantifiers are used in a proposition
> written in the English language, then it is assumed to be a universal
> proposition. If it were particular, then sentences such as "numbers are
> bigger than 10" and "cars are blue" would be true, since there are some
> numbers bigger than 10 and there are some cars that are blue. My
> knowledge of the inner workings of the English grammar is not that good,
> but at least this is what applies in Greek and in cs/logic (check
> http://www.cs.colostate.edu/~cs122/.Fall14/tutorials/tut_2.php for
> example). If I am mistaken, then it was error on my side.
>

Formally, yes, but in practice, there is ambiguity. For example, you can
say "elderly people vote for X political party", and it doesn't have to
mean that 100.0% of elderly people vote for that party for that to be a
reasonably accurate statement, if by and large that population has a clear
trend.

That's not to agree or disagree with Ryan's statement, just noting that
people sometimes do have to characterize groups, and that any
characterization of a large enough group will usually not apply to all of
its members.

I know I personally belong to a number of demographic groups whose behavior
as a group doesn't match mine as an individual, and when people criticize
those demographic groups, I try not to take it as a personal attack.

-- Eric


Re: Incident report D-TRUST: syntax error in one tls certificate

2018-12-01 Thread Eric Mill via dev-security-policy
On Wed, Nov 28, 2018 at 4:41 PM Jakob Bohm via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> On 27/11/2018 00:54, Ryan Sleevi wrote:
> > On Mon, Nov 26, 2018 at 12:12 PM Jakob Bohm via dev-security-policy <
> > dev-security-policy@lists.mozilla.org> wrote:
> >
> >> 2. Being critical from a society perspective (e.g. being the contact
> >> point for a service to help protect the planet), doesn't mean that
> the
> >> people running such a service can be expected to be IT superstars
> >> capable of dealing with complex IT issues such as unscheduled
> >> certificate replacement due to no fault of their own.
> >>
> >
> > That sounds like an operational risk the site (knowingly) took. Solutions
> > for automation exist, as do concepts such as "hiring multiple people"
> > (having a NOC/SOC). I see nothing to argue that a single person is
> somehow
> > the risk here.
> >
>
> The number of people in the world who can do this is substantially
> smaller than the number of sites that might need them.  We must
> therefore, by necessity, accept that some such sites will not hire such
> people, or worse multiple such people for their own exclusive use.
>
> Automating certificate deployment (as you often suggest) lowers
> operational security, as it necessarily grants read/write access to
> the certificate data (including private key) to an automated, online,
> unsupervised system.
>

Respectfully, this isn't accurate. Automated certificate deployment and
rotation is a best practice for high-functioning enterprises, and can be
done without exposing general read/write access to other systems. I've seen
automated certificate rotation implemented in several federal government
agencies, and (maybe more importantly) have seen many more agencies let
their certificates expire and impact the security of public services due to
a lack of automation.

Nick already described how the ACME protocol can be automated without
exposing the TLS private key, but more generally, organizations can use
scoped permissioning to grant individual components only the specific
access they need to accomplish their job. As an example, customers of
Amazon Web Services can use the IAM permissions framework to establish
granular permissions that mitigate the impact of component compromise.
Enterprises relying on self-managed infrastructure are free to implement a
similar system.
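The key-separation point can be made concrete with a small sketch. Everything here is hypothetical scaffolding invented for the illustration (the class and function names, and a random byte string standing in for a real key pair); a real deployment would use an ACME client and a proper CSR, but the boundary is the same: only the signing request crosses it.

```python
import hashlib
import os


class LocalKeyStore:
    """Hypothetical host-side component: the TLS private key is generated
    and kept here, and only a signing request ever leaves the machine."""

    def __init__(self) -> None:
        self._private_key = os.urandom(32)  # stand-in for a real key pair

    def make_csr(self) -> bytes:
        # A real CSR carries the public key plus a self-signature; a
        # fingerprint stands in here to show that only non-secret
        # material is exported from the key store.
        return b"CSR:" + hashlib.sha256(self._private_key).digest()


def ca_issue(csr: bytes) -> bytes:
    """Stand-in for an ACME CA endpoint: it sees the CSR, never the key."""
    return b"CERT-FOR:" + csr


store = LocalKeyStore()
certificate = ca_issue(store.make_csr())
assert certificate.startswith(b"CERT-FOR:CSR:")
assert store._private_key not in certificate  # key never crossed the boundary
```

Scoped credentials (for example, an IAM role that can request issuance but cannot read the key store) extend the same separation to the rest of the pipeline.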

For a government example of automated certificate issuance, see
https://cloud.gov/docs/services/cdn-route/, which is a FedRAMPed service
whose security authorization is signed off on by the Departments of Defense
and Homeland Security.

Societally important organizations who don't specialize in technology
(which is most of them), or for whatever reason can't feasibly automate
their certificate operations, should definitely be relying on
infrastructure managed by third parties which do specialize in this
technology, be it basic site hosting like Squarespace or more sophisticated
cloud services.

In other words, no organization has an excuse to not be able to rotate a
certificate given 5 days' notice. The fact that many large organizations
continue to have a problem with this doesn't make it any more excusable.

-- Eric


> Allowing multiple persons to replace the certificates also lowers
> operational security, as it (by definition) grants multiple persons
> read/write access to the certificate data.
>
> Under the current and past CA model, certificate and private key
> replacement is a rare (once/2 years) operation that can be done
> manually and scheduled weeks in advance, except for unexpected
> failures (such as a CA messing up).
>


Re: How harsh (in general) should Mozilla be towards CAs?

2018-11-09 Thread Eric Mill via dev-security-policy
On Thu, Nov 8, 2018 at 8:51 PM Jakob Bohm via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> Over the years, there has been some variation among participants in how
> harshly individual mistakes by CAs should be judged, ranging from "just
> file a satisfactory incident report, and all will be fine" to "Any tiny
> mistake could legally be construed as violating a formal requirement
> that would be much more catastrophic under other circumstances,
> therefore the maximum penalty of immediate distrust must be imposed".
>

This doesn't seem like an accurate description of the debates within the
Mozilla CA program, or this list, at all. I've never heard anyone make an
assertion that sounds like either extreme.

The long-term participants here, including those who press CAs hard, have
all responded very positively to timely, detailed incident reports that
demonstrate an understanding of the root cause and a credible plan to
address it.

There have definitely been quite a few CAs that have had incident reports
dragged out of them, or that filed reports addressing surface-level issues
without any apparent acknowledgment of the gravity of the problem.

Where incidents with little _immediate_ security impact have occurred (such
as certain kinds of spec non-conformance), they have typically become major
issues not because of the depth of perceived impact, but because of a
failure to acknowledge that poor responses to small issues are highly
predictive of future large ones, or because a long-term pattern
demonstrates this empirically.

The major distrust events of the last few years have all been preceded by
robust discussion and demonstration of long-term issues, and months or
years of poor communication with the community.

In other words, no one has been tossed on a technicality, and I've never
seen any regular member of the community advocate for tossing someone
solely on a technicality.


> Furthermore, people with some clout tend to shut down all
> counterarguments when taking either extreme position, creating a situation
> where only their own position is heard, making the entire "community"
> aspect an illusion.
>

This isn't my experience at all. Contributions from community members are
certainly distributed unevenly, but that seems to correspond most closely
to folks for whom participation here is part of their day job. That would
particularly be true for those who have spent years engaging in oversight
of a shifting array of CAs. And since the Mozilla CA Program itself is a CA
oversight program, those members have a very credible claim to represent
the community, even if others don't always have the time or mandate to
devote time to articulating the same arguments.

In general, I don't believe this post is well-grounded in fact; it presents
an inaccurate view of the Mozilla CA program's history. As a result, I
don't think it's likely to produce a constructive discussion.

-- Eric

-- 
konklone.com | @konklone 


Re: Visa Issues

2018-09-28 Thread Eric Mill via dev-security-policy
On Thu, Sep 27, 2018 at 5:22 PM Wayne Thayer via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> Visa has filed a bug [1] requesting removal of the eCommerce root from the
> Mozilla root store. Visa has also responded to the information requested in
> the qualified audits bug [2], but it's unclear if or when they will respond
> to the issues list presented in this thread. Two weeks have passed since I
> posted the issues list, and I see no reason to delay the complete distrust
> of Visa's eCommerce root. That is likely to happen in Firefox 64 [3] via
removal of the root from NSS version 3.40. Visa is still welcome to
> respond to the issues list, but I think the removal of Visa's only included
> root, and thus Visa, from the Mozilla CA Certificate Program implies that
> this discussion has reached a conclusion.
>

Visa also stated in their removal bug:

"Visa’s plan is to remove the SHA1 root and introduce a new SHA2 and ECC
root."

Were Visa to apply to the Mozilla program with one or more new roots, would
those be new discussions, or would that cause this discussion about Visa's
history of issues to be re-opened?

-- Eric


>
> - Wayne
>
> [1] https://bugzilla.mozilla.org/show_bug.cgi?id=1493822
> [2] https://bugzilla.mozilla.org/show_bug.cgi?id=1485851#c2
> [3] https://wiki.mozilla.org/Release_Management/Calendar
>
> On Sun, Sep 23, 2018 at 1:15 PM Ryan Sleevi  wrote:
>
> >
> >
> > On Thu, Sep 13, 2018 at 3:26 PM Wayne Thayer via dev-security-policy <
> > dev-security-policy@lists.mozilla.org> wrote:
> >
> >> Visa recently delivered new qualified audit reports for their eCommerce
> >> Root that is included in the Mozilla program. I opened a bug [1] and
> >> requested an incident report from Visa.
> >>
> >> Visa was also the subject of a thread [2] earlier this year in which I
> >> stated that I would look into some of the concerns that were raised.
> I've
> >> done that and have compiled the following issues list:
> >>
> >> https://wiki.mozilla.org/CA:Visa_Issues
> >>
> >> While I have attempted to make this list as complete, accurate, and
> >> factual
> >> as possible, it may be updated as more information is received from Visa
> >> and the community.
> >>
> >> I would like to request that a representative from Visa engage in this
> >> discussion and provide responses to these issues.
> >>
> >> - Wayne
> >>
> >> [1] https://bugzilla.mozilla.org/show_bug.cgi?id=1485851
> >> [2]
> >>
> >>
> https://groups.google.com/d/msg/mozilla.dev.security.policy/NNV3zvX43vE/ns8UUwp8BgAJ
> >
> >
> > I've not seen Visa engage in this discussion. The silence is rather
> > deafening, and arguably unacceptably so.
> >
> > With respect to the Qualified Audit, Visa's response as to the substance
> > of the issue is particularly unsettling.
> > https://bugzilla.mozilla.org/show_bug.cgi?id=1485851#c3 demonstrates
> that
> > they've not actually remediated the qualification, that they've further
> > failed to meet the BRs requirements on revocations by any reasonable
> > perspective, and they don't even have a plan yet to remedy this issue.
> >
> > Examining the bug itself is fairly disturbing, and the responses likely
> > reveal further BR violations. For example, the inability to obtain
> evidence
> > of domain validation information reveals that there are further issues
> with
> > 2-7.3 - namely, maintaining those logs for 7 years. The response to 2-7.3
> > suggests that there are likely more endemic issues around the issuance.
> >
> > Given the past issues, the recently identified issues (that appear to
> have
> > been longstanding), and the new issues that Visa's PKI Policy team is
> > actively engaging in, I believe it would be appropriate and necessary to
> > consider removing trust in this CA.
> >
>


-- 
konklone.com | @konklone 


Re: DEFCON Talk - Lost and Found Certificates

2018-08-19 Thread Eric Mill via dev-security-policy
On Sun, Aug 19, 2018 at 3:56 PM Eric Mill  wrote:

> On Thu, Aug 16, 2018 at 6:52 PM Jakob Bohm via dev-security-policy <
> dev-security-policy@lists.mozilla.org> wrote:
>
>>   - While infinitely wealthy organizations can afford getting separate
>>certificates for each DNS name, and while lowest-security (DV)
>>certificates are now available for zero dollars in the US, SANs remain
>>significant in case of high security validation (OV, EV) that costs
>>real money and effort, both to pay the CA and to provide evidence of
>>human and organizational genuineness, such as showing government IDs,
>>obtaining certified copies of registration statements, answering
>>validation phone calls to CEOs at strange hours etc.
>>
>
> DV certificates are appropriate for even the largest of organizations, and
> are likely to supplant OV/EV certificates over time. For an example by one
> of the largest enterprises in the world, see the U.S. Department of
> Defense's policy changes to allow and encourage the use of DV certificates
> throughout its public-facing infrastructure, and their public commitment to
> Congress to use this policy change to complete their public HTTPS-only
> transition by the end of 2018:
>
>
> https://www.wyden.senate.gov/imo/media/doc/wyden-web-encryption-letter-to-dod-cio.pdf
>

Wrong URL on my part - that was the letter to the Department of Defense,
and this is the letter they responded with describing their approval of DV
certificates and their plans in 2018 and beyond:

https://www.wyden.senate.gov/imo/media/doc/Wyden%20-%20DoD%20Web%20Services%20-%20Best%20Practices%20(Jul%2020%202018).pdf

-- Eric


Re: DEFCON Talk - Lost and Found Certificates

2018-08-19 Thread Eric Mill via dev-security-policy
On Thu, Aug 16, 2018 at 6:52 PM Jakob Bohm via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> It seems that my response to this presentation has brought out the crowd
> of people who are constantly looking to reduce the usefulness of
> certificates to anyone but the largest mega-corporations.
>
> To summarize my problem with this:
>
>   - While some large IT operations (and a minority of small ones) run
>fully automated setups that can trivially handle replacing
>certificates many times per year, many other certificate holders treat
>certificate replacement as a rare event that involves a lot of manual
>labor.  Shortening the maximum duration of certificates down to Let's
>encrypt levels will be a massive burden in terms of wasted man-hours
>accumulated over millions (billions?) of organizations having to do 4
>times a year what they used to do every two or five years.
>

The trend is away from manual replacement, not towards it -- and that's
true for individual people, for large enterprises, and for smaller
companies in between. For individuals and smaller enterprises, this
manifests mostly in the increasing outsourcing of certificate management to
third parties (e.g. Squarespace, Cloudflare, AWS Certificate Manager,
etc.).

For larger enterprises, the same outsourcing is also present and is
mitigating manual rotation burdens, but some are also investing in their
own systems for automation inside their environments. I've seen several
spring up in enterprise environments I'm close to in the last few years in
order to handle the increasing pressure to secure connections by default
even when the certificate volume is high.

Reducing certificate lifetimes to 13 months, in addition to addressing the
real security issue identified by the Lost and Found Certificates
presentation, is likely to further these trends, which would be a positive
development both for user security and enterprise agility.

  - While infinitely wealthy organizations can afford getting separate
>certificates for each DNS name, and while lowest-security (DV)
>certificates are now available for zero dollars in the US, SANs remain
>significant in case of high security validation (OV, EV) that costs
>real money and effort, both to pay the CA and to provide evidence of
>human and organizational genuineness, such as showing government IDs,
>obtaining certified copies of registration statements, answering
>validation phone calls to CEOs at strange hours etc.
>

DV certificates are appropriate for even the largest of organizations, and
are likely to supplant OV/EV certificates over time. For an example by one
of the largest enterprises in the world, see the U.S. Department of
Defense's policy changes to allow and encourage the use of DV certificates
throughout its public-facing infrastructure, and their public commitment to
Congress to use this policy change to complete their public HTTPS-only
transition by the end of 2018:

https://www.wyden.senate.gov/imo/media/doc/wyden-web-encryption-letter-to-dod-cio.pdf

>
> Off topic notes related to this thread:
>
>   - It is bad form to reply to posts with a personal e-mail cc-ed to the
>mailing list unless explicitly requested by the original poster.
>

So you're aware, this is the default behavior of "Reply All" for this list,
at least in Gmail. If this creates a particular hassle for people, I can
personally try to remember to remove their emails when replying to the list
-- but I think the only practical way to address this would be to modify
the list settings in some way, rather than ask for changes from individual
posters.

-- Eric


Re: DEFCON Talk - Lost and Found Certificates

2018-08-16 Thread Eric Mill via dev-security-policy
On Wed, Aug 15, 2018 at 6:36 AM Wayne Thayer via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> I'd like to call this presentation to everyone's attention:
>
> Title: Lost and Found Certificates: dealing with residual certificates for
> pre-owned domains
>
> Slide deck:
>
> https://media.defcon.org/DEF%20CON%2026/DEF%20CON%2026%20presentations/DEFCON-26-Foster-and-Ayrey-Lost-and-Found-Certs-residual-certs-for-pre-owned-domains.pdf
>
> (NOTE: this PDF loads in Firefox, but not in Safari and not, I'm told, in
> Chrome's native PDF viewer).
>
> Demo website: https://insecure.design/
>
> The basic idea here is that domain names regularly change owners, creating
> "residual certificates" controlled by the previous owner that can be used
> for MITM. When a bunch of unrelated websites are thrown into the same
> certificate by a service provider (e.g. CDN), then this also creates the
> opportunity to DoS the sites by asking the CA to revoke the certificate.
>
> The deck includes some recommendations for CAs.
>
> What, if anything, should we do about this issue?
>

I think this paper provides a good impetus to look at further shortening
certificate lifetimes down to 13 months. That would better match the annual
cadence of domain registration so that there's a smaller window of time
beyond domain expiration for which a certificate would be valid, and would
continue the momentum Mozilla and the CA/B Forum have been building around
reducing certificate lifetimes and encouraging automation.
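To illustrate the window in question, here is a rough sketch (the dates and the 825/397-day figures are illustrative approximations of the 27- and 13-month maximums, not from the presentation) of how much residual validity a certificate retains after its domain's registration lapses:

```python
from datetime import date, timedelta

def residual_days(cert_not_after, domain_expires):
    """Days a certificate stays valid after the domain registration
    lapses (zero if the certificate expires first)."""
    return max((cert_not_after - domain_expires).days, 0)

# Hypothetical case: cert issued on the day of an annual domain renewal,
# and the owner lets the domain go at the next renewal, one year later.
issued = date(2018, 1, 1)
domain_lapses = date(2019, 1, 1)

print(residual_days(issued + timedelta(days=825), domain_lapses))  # ~27-month cert: 460
print(residual_days(issued + timedelta(days=397), domain_lapses))  # ~13-month cert: 32
```

Under these assumptions the shorter maximum shrinks the exploitable window from over a year to about a month.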

The presentation suggests having certificates only be valid through the
expiration date of the relevant registered domain, but I think that's
unrealistic. Most of the time, domains are set to autorenew so that people
never have to think about them, and their renewal cadence is totally
disconnected from certificate renewal cadence. If a domain is 6 days from
autorenew, a CA offering a 6-day-long cert and forcing someone to come back
a week later for another one would be very unreasonable.

I don't think the presentation points to building in stronger support for
revocation. If anything, it points to revocation being a threat vector for
DoS-ing sites that have nothing to do with the problem at hand, due to the
long-standing (and reasonable) practice of multi-SAN certs that combine
clumps of customers into individual certificates. Ryan points out that SNI
is becoming something that can be relied on more universally, which would
reduce the need for multi-SAN certificates, but multi-SAN certificates also
provide useful operational benefits to organizations who are using CAs with
rate limits, or simply for whom the ability to use 100x fewer certificates
relieves an operational scaling burden.

It may still be useful to deprecate multi-SAN certificates over time, but I
think the single biggest thing to take away from the presentation is that
long-lived certs create invisible risks during domain transfers, and that
the risk is more than just theoretical when looking at the whole of the
web. It's been a year and a half now since the last discussion and vote
that went from a 39-month max to a 27-month max, so I think it's a great
time to start talking about a 13-month maximum.

-- Eric



> - Wayne
>


-- 
konklone.com | @konklone 


Re: Clarification about Ballot 193 and validity of a certificate issued on March 1st

2018-04-22 Thread Eric Mill via dev-security-policy
On Sun, Apr 22, 2018 at 5:01 PM, Rob Stradling <r...@comodoca.com> wrote:

> On 22/04/18 21:04, Eric Mill via dev-security-policy wrote:
>
>> On Fri, Apr 20, 2018 at 9:30 AM, Tim Shirley via dev-security-policy <
>> dev-security-policy@lists.mozilla.org> wrote:
>>
>> First of all, it's important to distinguish between the BR r
>>> But even if you accept my premise there, then you have to ask "in what
>>> timezone?"  March 1 00:00:00 2018 GMT in North America is February 28.
>>> So
>>> I could see someone making the argument that issuance at that moment in
>>> time is fine if the CA is in North America but it's mis-issuance if the
>>> CA
>>> is in Europe, since the requirements don't state that the measurement is
>>> UTC.  This is why I'm not a fan of such precise enforcements of
>>> date-related compliance.  There are a lot of different ways to interpret
>>> dates/times, but none of the readings materially change the net effect of
>>> the rule.  That is, all readings change the max validity period to ~825
>>> days (which itself is subject to debate as to its precise meaning in
>>> terms
>>> of seconds) within a day or two of each other.  So, enforcing the date as
>>> Mar 1 as opposed to Mar 2 doesn't seem to add a lot of value and leads to
>>> confusion like this.
>>>
>>
>> I'm just going to double down on Matt's comment that the problem here
>> doesn't seem to be in strictness of enforcement, but rather CAs leaving
>> themselves no buffer zone.
>>
>
> The problem here, IMHO, is that the BR requirement was poorly written.
>
> Whatever business advantage there is of giving
>> customers that one last day to get 3-year certs, seems likely not as
>> valuable as the certainty of avoiding giving those customers errors when
>> the certs are used in major browsers.
>>
>
> The certainty?  Hindsight is a wonderful thing.  When I wrote Comodo CA's
> code to enforce the "after 1 March 2018" rule, this "certainty" did not
> occur to me.  I simply read the BR requirement and then implemented code to
> enforce it.


Yeah, I completely get how that would happen. I just think this is a good
learning opportunity: protect against ambiguously written requirements by
building in a day's buffer.
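As a sketch of the buffer idea (the cutoff date is the BR 6.3.2 deadline discussed in this thread; the one-day buffer and the function name are my own illustration, not any CA's actual code), the issuing side can simply move its own cutoff a day early, making the > versus >= reading moot:

```python
from datetime import date, timedelta

DEADLINE = date(2018, 3, 1)   # "issued after 1 March 2018" (BR 6.3.2)
BUFFER = timedelta(days=1)    # absorbs the > vs >= and timezone ambiguity

def must_use_short_validity(issuance_date):
    """Apply the new 825-day limit starting a day *before* the literal
    deadline, so that no reading of the requirement is violated."""
    return issuance_date >= DEADLINE - BUFFER

print(must_use_short_validity(date(2018, 2, 27)))  # → False
print(must_use_short_validity(date(2018, 3, 1)))   # → True
```

The cost is one day of the old maximum validity; the benefit is that every interpretation of the deadline, in every time zone, is satisfied.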

Tim's time zone example is another good reason to give that buffer, even if
the BR language made it clear whether it was > or >=. A tangentially
similar case: in other systems I've built that organize search results and
push notifications around dates, the only safe approach is to assign times
as 12:00 UTC. Even if that doesn't accurately describe when something
happened, it guarantees that no matter what time zone someone is in, they
see it as the same day. The imprecision is worth the consistency.
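The noon-UTC trick can be sketched like this (a toy demonstration of the idea, not code from any particular system; note it holds for UTC offsets within ±12 hours, which covers nearly all real zones):

```python
from datetime import datetime, timedelta, timezone

def noon_utc(year, month, day):
    """Timestamp an all-day event at 12:00 UTC so its calendar date is
    stable across (almost) every local time zone."""
    return datetime(year, month, day, 12, 0, tzinfo=timezone.utc)

stamp = noon_utc(2018, 3, 1)
# Convert to near-extreme offsets; the calendar date survives in both.
west = stamp.astimezone(timezone(timedelta(hours=-11)))
east = stamp.astimezone(timezone(timedelta(hours=11)))
print(west.date(), east.date())  # both 2018-03-01
```

Anchored at noon UTC, the timestamp stays on the same calendar date from UTC-11 through UTC+11, which is exactly the consistency property described above.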


>
>
> -- Eric
>>
>>
>>
>>> On 4/19/18, 10:10 PM, "dev-security-policy on behalf of Simone Carletti
>>> via dev-security-policy" <dev-security-policy-bounces+tshirley=trustwave@lists.mozilla.org on behalf of dev-security-policy@lists.mozilla.org> wrote:
>>>
>>>  Hello,
>>>
>>>  I'm investigating an issue on behalf of a customer. Our customer
>>> requested a multi-year certificate that was issued on March 1st by
>>> Comodo.
>>>
>>>  Here's the certificate:
>>>  https://crt.sh?id=354042595
>>>
>>>  Validity
>>>  Not Before: Mar  1 00:00:00 2018 GMT
>>>  Not After : May 29 23:59:59 2021 GMT
>>>
>>>  The certificate is currently considered invalid at least by Google
>>> Chrome.
>>>
>>>  It's my understanding that Google Chrome uses a >= comparison, which
>>> effectively means certificates issued on March 1st are already subject to
>>> Ballot 193.
>>>
>>>  However, it looks like the interpretation of Comodo of Ballot 193
>>> here
>>> is based on a > comparison, since the certificate was issued with a 3y
>>> validity.
>>>
>>>  BR 6.3.2 says:
>>>
>>>  > Subscriber Certificates issued after 1 March 2018 MUST have a
>>> Validity Period no greater than 825 days.
>>>  > Subscriber Certificates issued after 1 July 2016 but prior to 1
>>> March 2018 MUST have a Validity Period no greater than 39 months.
>>>
>>>  I'd appreciate some hints about whether a certificate issued on
>>> March
>>> 1st should be considered subject to Ballot 193 or not.
>>>
>>>  Best,
>>>  -- Simone
>>>
>>
> --
> Rob Stradling
> Senior Research & Development Scientist
> Email: r...@comodoca.com
>



-- 
konklone.com | @konklone <https://twitter.com/konklone>


Re: Clarification about Ballot 193 and validity of a certificate issued on March 1st

2018-04-22 Thread Eric Mill via dev-security-policy
On Fri, Apr 20, 2018 at 9:30 AM, Tim Shirley via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> First of all, it's important to distinguish between the BR r
> But even if you accept my premise there, then you have to ask "in what
> timezone?"  March 1 00:00:00 2018 GMT in North America is February 28.  So
> I could see someone making the argument that issuance at that moment in
> time is fine if the CA is in North America but it's mis-issuance if the CA
> is in Europe, since the requirements don't state that the measurement is
> UTC.  This is why I'm not a fan of such precise enforcements of
> date-related compliance.  There are a lot of different ways to interpret
> dates/times, but none of the readings materially change the net effect of
> the rule.  That is, all readings change the max validity period to ~825
> days (which itself is subject to debate as to its precise meaning in terms
> of seconds) within a day or two of each other.  So, enforcing the date as
> Mar 1 as opposed to Mar 2 doesn't seem to add a lot of value and leads to
> confusion like this.
>

I'm just going to double down on Matt's comment that the problem here
doesn't seem to be in strictness of enforcement, but rather CAs leaving
themselves no buffer zone. Whatever business advantage there is of giving
customers that one last day to get 3-year certs, seems likely not as
valuable as the certainty of avoiding giving those customers errors when
the certs are used in major browsers.

-- Eric


>
> On 4/19/18, 10:10 PM, "dev-security-policy on behalf of Simone Carletti
> via dev-security-policy" <dev-security-policy-bounces+tshirley=trustwave@lists.mozilla.org on behalf of dev-security-policy@lists.mozilla.org> wrote:
>
> Hello,
>
> I'm investigating an issue on behalf of a customer. Our customer
> requested a multi-year certificate that was issued on March 1st by Comodo.
>
> Here's the certificate:
> https://crt.sh?id=354042595
>
> Validity
> Not Before: Mar  1 00:00:00 2018 GMT
> Not After : May 29 23:59:59 2021 GMT
>
> The certificate is currently considered invalid at least by Google
> Chrome.
>
> It's my understanding that Google Chrome uses a >= comparison, which
> effectively means certificates issued on March 1st are already subject to
> Ballot 193.
>
> However, it looks like the interpretation of Comodo of Ballot 193 here
> is based on a > comparison, since the certificate was issued with a 3y
> validity.
>
> BR 6.3.2 says:
>
> > Subscriber Certificates issued after 1 March 2018 MUST have a
> Validity Period no greater than 825 days.
> > Subscriber Certificates issued after 1 July 2016 but prior to 1
> March 2018 MUST have a Validity Period no greater than 39 months.
>
> I'd appreciate some hints about whether a certificate issued on March
> 1st should be considered subject to Ballot 193 or not.
>
> Best,
> -- Simone
>
>
>



-- 
konklone.com | @konklone 


Re: Sigh. stripe.ian.sh back with EV certificate for Stripe, Inc of Kentucky....

2018-04-12 Thread Eric Mill via dev-security-policy
On Thu, Apr 12, 2018 at 2:57 PM, Eric Mill  wrote:
>
>
> Of course, that would break his proof-of-concept exploit.  Which is the
>> right outcome.  It demonstrates that an EV certificate used in a manner
>> which might cause confusion will be revoked.  They're not stopping him from
>> publishing.  He can still do that, without the benefit of an EV certificate.
>>
>
> The stripe.ian.sh site itself is not likely to cause confusion, and was
> not an exploit. Here's what stripe.ian.sh looks like right now:
>

(Inline images don't appear to play too well with m.d.s.p, so I've attached
the image to this email.)

-- 
konklone.com | @konklone 


Re: Sigh. stripe.ian.sh back with EV certificate for Stripe, Inc of Kentucky....

2018-04-12 Thread Eric Mill via dev-security-policy
On Thu, Apr 12, 2018 at 2:41 PM, Matthew Hardeman 
wrote:

>
>
> On Thu, Apr 12, 2018 at 1:24 PM, Eric Mill  wrote:
>
>> Ian's intent may have been to demonstrate EV's weaknesses, but that
>> doesn't mean Ian was intending to deceive users. If Ian had used this to
>> try to get people to enter their Stripe credentials or something, then
>> that'd be one thing. But registering an LLC and then creating a cert for it
>> is a legitimate activity.
>>
>>
> Except that Ian intended to demonstrate that he could receive and maintain
> a valid EV certificate to be utilized in a manner which may deceive users.
> Not deceive with lies, but deceive in the sense of bucking their expectations.
>

But he did not deceive users. Demonstrating that this is possible is not
itself an act of deception.

As it is, this effectively censors Ian's website where he is making a
>> statement about how EV works and how it interacts with
>> trademark/registration laws, through his own registered business. That
>> statement is -- and I'm being serious -- being oppressed, based on a
>> capricious decision by a CA.
>>
>
> The only sense in which it censors his website is that he doesn't
> presently have an EV certificate on it.  If he wants it to be available to
> the public again, he can get a DV certificate for it any time.
>

No, this act took his website down immediately for reasons related to its
statement (rather than any deceptive actions). That's censorship, even if
options exist to work around this censorship. If his registrar had disabled
his DNS, would it have been okay to describe that as "well, he can just get
another registrar who doesn't think his site is deceptive! Or he can just
use an IP address!". No, that would have been a Big Deal.

Of course, that would break his proof-of-concept exploit.  Which is the
> right outcome.  It demonstrates that an EV certificate used in a manner
> which might cause confusion will be revoked.  They're not stopping him from
> publishing.  He can still do that, without the benefit of an EV certificate.
>

The stripe.ian.sh site itself is not likely to cause confusion, and was not
an exploit. Here's what stripe.ian.sh looks like right now:



This is not going to confuse anyone into thinking they're interacting with
the payment processing company. Stripe, LLC, the Kentucky-registered
company owned by Ian Carroll, is perfectly free to publish the statement
above. If the payment processing company objects, their appropriate method
of redress in the US is through the judicial system, or other
government-designed arbitration processes.


> Ian is now not able to maintain this public demonstration on the internet
>> in any browser (including Chrome, since it's EV), despite having committed
>> no crimes, not having engaged in any malicious behavior, and not harmed any
>> users.
>>
>
> He could always just use a DV certificate, but then he wouldn't be able to
> drag along GoDaddy's endorsement and attach it to his particular exercise
> of free speech to which GoDaddy apparently objects.
>

GoDaddy issuing an EV certificate can't be construed as endorsing the
speech on that website (and I am sure GoDaddy's lawyers would agree with
me!). GoDaddy would hardly be able to issue many EV certificates at all if
they were constantly expected to be endorsing the website contents of those
who receive them.

But the last part of your sentence is correct: GoDaddy apparently objects
to Ian's particular exercise of free speech. And that's the problem.

-- Eric

-- 
konklone.com | @konklone 


Re: Sigh. stripe.ian.sh back with EV certificate for Stripe, Inc of Kentucky....

2018-04-12 Thread Eric Mill via dev-security-policy
On Thu, Apr 12, 2018 at 1:03 PM, Wayne Thayer  wrote:

> On Thu, Apr 12, 2018 at 9:45 AM, Ryan Sleevi  wrote:
>
>>
>> In what way is it misleading though? It fully identified the organization
>> that exists, which is a legitimate organization. Thus, the information that
>> appears within the certificate itself is not misleading - and I don't think
>> 4.9.1.1 applies.
>>
>> I would refer you to your email, kicking off the 150+ message thread on
> this topic back in December, that included these statements:
>
> "...and more importantly, how easy it is to obtain certificates that may
> confuse or mislead users"
> "given the ability to provide accurate-but-misleading information in EV
> certificates,..."
>
> https://groups.google.com/d/msg/mozilla.dev.security.policy/szD2KBHfwl8/
> kWLDMfPhBgAJ
>

Ryan is allowed to change his mind on whether this should be considered
misleading. But either way, I do not believe either was misleading.

Ian's intent may have been to demonstrate EV's weaknesses, but that doesn't
mean Ian was intending to deceive users. If Ian had used this to try to get
people to enter their Stripe credentials or something, then that'd be one
thing. But registering an LLC and then creating a cert for it is a
legitimate activity.

If Ian shouldn't have been allowed to register this business, then that's
something the state/country he registered the business in should express
through laws or adjudication of the registration. The rules and criteria
for those processes are established in many countries through a process at
least nominally responsive to public values.

As it is, this effectively censors Ian's website where he is making a
statement about how EV works and how it interacts with
trademark/registration laws, through his own registered business. That
statement is -- and I'm being serious -- being oppressed, based on a
capricious decision by a CA.

Ian is now not able to maintain this public demonstration on the internet
in any browser (including Chrome, since it's EV), despite having committed
no crimes, not having engaged in any malicious behavior, and not harmed any
users.

That's not the kind of outcome I understand to be consistent with Mozilla's
values and commitment to an open web. I'm fine being told that it's not
fair to come down on any one CA right now, since it's happened a few times
and many folks have considered this normal. But I don't think this is
something Mozilla should continue to consider as normal business practices.

-- Eric

-- 
konklone.com | @konklone 


Re: Sigh. stripe.ian.sh back with EV certificate for Stripe, Inc of Kentucky....

2018-04-12 Thread Eric Mill via dev-security-policy
On Thu, Apr 12, 2018 at 12:32 PM, Wayne Thayer <wtha...@mozilla.com> wrote:

> On Thu, Apr 12, 2018 at 8:10 AM, Ryan Sleevi via dev-security-policy <
> dev-security-policy@lists.mozilla.org> wrote:
>
>> Indeed, I find it concerning that several CAs were more than happy to take
>> Ian's money for the issuance, but then determined (without apparent cause
>> or evidence) to revoke the certificate. Is there any evidence that this
>> certificate was misissued - that the information was not correct? Is there
>> evidence that Ian, as Subscriber, or stripe.ian.sh, as domain holder,
>> requested this certificate to be revoked?
>>
>> If anything, this highlights the deeply concerning practices of revocation
>> by CAs, and their ability to disrupt services of legitimate businesses.
>>
> BR 4.9.1.1 states that a CA SHALL revoke a certificate within 24 hours if
> "The CA determines that any of the information appearing in the
> Certificate is inaccurate or misleading." I'm sympathetic to the arguments
> being made here, but the whole point of this discussion is that the EV
> information presented to users is misleading, so these CAs did what was
> required of them.
>

That's not accurate -- the EV information presented to users was not
misleading. It correctly described Ian's registered company. The
certificate was incorrectly revoked. We should probably be discussing
whether punitive measures are appropriate for this revocation.

-- Eric


>
> On Thu, Apr 12, 2018 at 10:20 AM, Eric Mill via dev-security-policy <
>> dev-security-policy@lists.mozilla.org> wrote:
>>
>> > I'll go further, and protest why the EV cert was revoked. Why can't Ian
>> > have a "Stripe, Inc." EV certificate for his business if he wants to?
>> What
>> > makes the payment processing company somehow more deserving of one than
>> > Ian's company? Why was GoDaddy allowed to effectively take Ian's site
>> down
>> > without his consent?
>> >
>> > If this is how EV is going to be handled, I think it's time to seriously
>> > discuss removing the display of EV information from Mozilla products.
>> >
>> > -- Eric
>> >
>> > On Wed, Apr 11, 2018 at 3:31 PM, Jonathan Rudenberg via
>> dev-security-policy
>> > <dev-security-policy@lists.mozilla.org> wrote:
>> >
>> > > On Wed, Apr 11, 2018, at 15:27, Matthew Hardeman via
>> dev-security-policy
>> > > wrote:
>> > > > It was injudicious of a CA to issue another certificate in this name
>> > for
>> > > > this entity after the already well documented controversy.  Did they
>> > just
>> > > > not care that it would invite trouble or did they not know that it
>> > would
>> > > > invite controversy and trouble because they didn't track it the
>> first
>> > > time
>> > > > around?
>> > >
>> > > What "trouble" is being invited? I don't see a problem. Everything is
>> > > operating exactly as expected. GoDaddy did nothing wrong.
>>
>>
>


-- 
konklone.com | @konklone <https://twitter.com/konklone>


Re: Sigh. stripe.ian.sh back with EV certificate for Stripe, Inc of Kentucky....

2018-04-12 Thread Eric Mill via dev-security-policy
It's not clear that end users pay any attention to EV UI, or properly
understand what they're looking at. It's especially unclear whether, if a
user went to a site that was *lacking* EV but just had a DV/OV UI, the
user would notice anything at all.

That's the status quo. This incident makes it more clear that even if we
invested more in EV UI in some way, it would only exacerbate a capricious
dynamic where CAs are responsible for deciding which brands and companies
are more important than others, and use arbitrary and undefined criteria to
decide whether a legitimate web service and registered business entity will
suffer immediate downtime.

Fortunately, because so few users make decisions based on EV UI, it's also
not clear Mozilla would suffer much in the way of first-mover disadvantage
by removing it. Users choose what browsers they use, not CAs, and the loss
of EV UI seems unlikely to generate much in the way of users switching
their user agents.

-- Eric



On Thu, Apr 12, 2018 at 11:35 AM, Matthew Hardeman <mharde...@gmail.com>
wrote:

> As far as I've seen there's no notion of "shall issue" or "must issue" in
> any of the guidelines.
>
> In other words, it would appear that CAs are free to restrict issuance or
> restrict use and validity of EV certificates (or any other certificates,
> for that matter) if they so choose.
>
> Mr. Carroll may have a commercial dispute between himself or his entity
> and the CAs, but that's a routine commercial dispute.  It appears likely
> that the terms of engagement with most of the commercial CAs would grant
> the CA cover to revoke if they find the certificate or its use to be
> perverse to security or likely to cause risk, etc.
>
> Is there a censorship aspect there?  Perhaps.  As has been noted before,
> however, we're forced to tolerate that from Microsoft anyway.
>
> On Thu, Apr 12, 2018 at 10:10 AM, Ryan Sleevi <r...@sleevi.com> wrote:
>
>> Indeed, I find it concerning that several CAs were more than happy to
>> take Ian's money for the issuance, but then determined (without apparent
>> cause or evidence) to revoke the certificate. Is there any evidence that
>> this certificate was misissued - that the information was not correct? Is
>> there evidence that Ian, as Subscriber, or stripe.ian.sh, as domain
>> holder, requested this certificate to be revoked?
>>
>> If anything, this highlights the deeply concerning practices of
>> revocation by CAs, and their ability to disrupt services of legitimate
>> businesses.
>>
>> On Thu, Apr 12, 2018 at 10:20 AM, Eric Mill via dev-security-policy <
>> dev-security-policy@lists.mozilla.org> wrote:
>>
>>> I'll go further, and protest why the EV cert was revoked. Why can't Ian
>>> have a "Stripe, Inc." EV certificate for his business if he wants to?
>>> What
>>> makes the payment processing company somehow more deserving of one than
>>> Ian's company? Why was GoDaddy allowed to effectively take Ian's site
>>> down
>>> without his consent?
>>>
>>> If this is how EV is going to be handled, I think it's time to seriously
>>> discuss removing the display of EV information from Mozilla products.
>>>
>>> -- Eric
>>>
>>> On Wed, Apr 11, 2018 at 3:31 PM, Jonathan Rudenberg via
>>> dev-security-policy
>>> <dev-security-policy@lists.mozilla.org> wrote:
>>>
>>> > On Wed, Apr 11, 2018, at 15:27, Matthew Hardeman via
>>> dev-security-policy
>>> > wrote:
>>> > > It was injudicious of a CA to issue another certificate in this name
>>> for
>>> > > this entity after the already well documented controversy.  Did they
>>> just
>>> > > not care that it would invite trouble or did they not know that it
>>> would
>>> > > invite controversy and trouble because they didn't track it the first
>>> > time
>>> > > around?
>>> >
>>> > What "trouble" is being invited? I don't see a problem. Everything is
>>> > operating exactly as expected. GoDaddy did nothing wrong.
>>> > ___
>>> > dev-security-policy mailing list
>>> > dev-security-policy@lists.mozilla.org
>>> > https://lists.mozilla.org/listinfo/dev-security-policy
>>> >
>>>
>>>
>>>
>>> --
>>> konklone.com | @konklone <https://twitter.com/konklone>
>>> ___
>>> dev-security-policy mailing list
>>> dev-security-policy@lists.mozilla.org
>>> https://lists.mozilla.org/listinfo/dev-security-policy
>>>
>>
>>
>


-- 
konklone.com | @konklone <https://twitter.com/konklone>


Re: Sigh. stripe.ian.sh back with EV certificate for Stripe, Inc of Kentucky....

2018-04-12 Thread Eric Mill via dev-security-policy
I'll go further, and protest why the EV cert was revoked. Why can't Ian
have a "Stripe, Inc." EV certificate for his business if he wants to? What
makes the payment processing company somehow more deserving of one than
Ian's company? Why was GoDaddy allowed to effectively take Ian's site down
without his consent?

If this is how EV is going to be handled, I think it's time to seriously
discuss removing the display of EV information from Mozilla products.

-- Eric

On Wed, Apr 11, 2018 at 3:31 PM, Jonathan Rudenberg via dev-security-policy
 wrote:

> On Wed, Apr 11, 2018, at 15:27, Matthew Hardeman via dev-security-policy
> wrote:
> > It was injudicious of a CA to issue another certificate in this name for
> > this entity after the already well documented controversy.  Did they just
> > not care that it would invite trouble or did they not know that it would
> > invite controversy and trouble because they didn't track it the first
> time
> > around?
>
> What "trouble" is being invited? I don't see a problem. Everything is
> operating exactly as expected. GoDaddy did nothing wrong.
> ___
> dev-security-policy mailing list
> dev-security-policy@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-security-policy
>



-- 
konklone.com | @konklone 


Re: Discovering unlogged certificates in internet-wide scans

2018-04-01 Thread Eric Mill via dev-security-policy
Did you submit the ~25K unexpired unlogged certs to CT?
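For reference, submitting a chain to a CT log is a single unauthenticated HTTP POST per RFC 6962 (`add-chain`). A minimal sketch of building and sending the request body in Python follows; the log URL is a hypothetical placeholder, and a real submission would target one of the production logs:

```python
import json
import re
import urllib.request

CT_LOG_URL = "https://ct.example.com/ct/v1/add-chain"  # hypothetical log endpoint


def build_add_chain_payload(pem_chain: str) -> bytes:
    """Build the JSON body for an RFC 6962 add-chain call.

    Each element of "chain" is the base64 of one certificate's DER,
    leaf first -- which is exactly the PEM body with whitespace removed.
    """
    blocks = re.findall(
        r"-----BEGIN CERTIFICATE-----(.*?)-----END CERTIFICATE-----",
        pem_chain, re.S)
    chain = ["".join(b.split()) for b in blocks]
    return json.dumps({"chain": chain}).encode()


def submit(pem_chain: str) -> dict:
    """POST the chain to the log; the response carries the SCT."""
    req = urllib.request.Request(
        CT_LOG_URL, data=build_add_chain_payload(pem_chain),
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Logs accept any chain terminating in a root they recognize, which is what makes bulk submission of found-in-the-wild certificates practical.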

On Sat, Mar 31, 2018 at 6:14 PM, Tim Smith via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> Hi MDSP,
>
> I went looking for corpuses of certificates that may not have been
> previously logged to CT and found some in the Rapid7 "More SSL" dataset,
> which captures certificates from their scans of non-HTTPS ports for
> TLS-speaking services.
>
> I wrote up some findings at
> http://blog.tim-smith.us/2018/03/moressl-spelunking/.
>
> A few highlights include:
> - of the ~10 million certificates in the corpus, about 20% had valid
> signatures and chained to roots included in the Mozilla trust store
> - about 50,000 of the 2 million trusted certificates had not previously
> been logged
> - about half of the novel certificates were unexpired
>
> There were interesting examples of unexpired, non-compliant, trusted
> certificates chaining to issuers including GoDaddy, NetLock, Logius, and
> Entrust. (I have not taken any action to inform issuers of these findings,
> other than this message and by publishing the certificates to CT logs.)
>
> I welcome any feedback or questions about the value of the approach and the
> findings.
>
> Thanks,
> Tim
> ___
> dev-security-policy mailing list
> dev-security-policy@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-security-policy
>



-- 
konklone.com | @konklone 


Re: TURKTRUST Non-compliance

2018-03-20 Thread Eric Mill via dev-security-policy
On Tue, Mar 20, 2018 at 3:43 PM, Jakob Bohm via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> On 20/03/2018 18:49, Ryan Sleevi wrote:
>
>> On Tue, Mar 20, 2018 at 1:30 PM, Jakob Bohm via dev-security-policy <
>> dev-security-policy@lists.mozilla.org> wrote:
>>
>>>
>>>> Are you suggesting that the BRs be modified so a CA that has ceased
>>>> issuance can obtain a clean audit report without meeting all current BR
>>>> requirements?
>>>>
>>> I am suggesting that we consider what policy should be applied to the
>>> (required!) capability of keeping revocation running for max cert
>>> lifetime after a CA ceases to operate.
>>>
>>>
>> The BRs already cover this. The point is that once a CA stops auditing,
>> there's an issue about ensuring conformance.
>>
>>
> Actually, they don't.  They have an empty placeholder section for wind
> down procedures.  Surely one could blindly apply the full BRs to the
> situation, which I am arguing against.
>

The BRs absolutely cover this. The empty placeholder section is there for
CAs to describe their specific wind-down procedures (the BRs are often used
as a starting point for CAs developing a CP, with the intent that CAs will
fill out each section with their specific controls), but that does not mean
that the BRs don't cover CAs that are winding down.

And the BRs *should* cover this, because all that matters to BR scope is
whether a CA is still technically capable of issuing certificates. Their
own stated commitment to no longer issuing certificates is immaterial.

I think it's not going to be productive to spend a lot of time on the list
debating whether or not a CA can opt out of full BR compliance by simply
saying "we're winding down and won't issue certificates anymore". From
Mozilla's perspective, any root in their trust stores needs to be held to
the same standard.

-- Eric

-- 
konklone.com | @konklone 


Re: TURKTRUST Non-compliance

2018-03-16 Thread Eric Mill via dev-security-policy
In TurkTrust's 2016 email noting that they were suspending their TLS
certificate business, they noted it stemmed mainly from not being accepted
to all major root stores (Apple did not accept them).

Therefore, the sites using these certificates are not trusted by some major
client bases, which is likely why some of the few existing sites that have
TurkTrust certificates, such as http://www.enpos.com.tr and
http://www.turktrust.com.tr/tr/, do not redirect clients to HTTPS. This
lack of reliance on the certificates for HTTPS reduces the impact that
suspending trust in the remaining certificates would have on Mozilla's users.

Even if this were not the case, I would agree and recommend prompt removal
of this explicitly unmaintained, unaudited hierarchy to protect Mozilla's
users. The above only makes it even more obviously the right decision.

-- Eric

On Fri, Mar 16, 2018 at 8:23 PM, Wayne Thayer via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> TURKTRUST has the "TÜRKTRUST Elektronik Sertifika Hizmet Sağlayıcısı H5"
> root included in the Mozilla program with the 'websites' trust bit enabled
> (not EV). Crt.sh identifies one unexpired and unrevoked subordinate CA [1],
> and 13 unexpired end-entity certificates signed by this root [2]. The
> audits for this root are either already expired (based on audit date) or
> nearly expired (based on the ETSI certificate expiration date) [3] [4].
>
> TURKTRUST announced the suspension of their SSL business in 2016 [5].
>
> TURKTRUST failed to respond to the January 2018 CA Communication. After
> repeated attempts, they did respond to my emails and posted a statement in
> the bug [6] including the following:
>
> The strategic decision mentioned above is actually suspending all SSL
> > business supporting activities that incur direct costs for TURKTRUST,
> > including suspending the ETSI and BR audits or OV and EV SSL related
> > insurance policies. We have also ceased our investment and studies on CT
> > and CAA requirements for the time being that are actually mandatory
> > criteria set by the CA/Browser Forum.
> >
>
> TURKTRUST has chosen not to request removal of the root, but I believe this
> is a clear case in which prompt removal of the root is necessary.
>
> I would appreciate everyone's constructive input on what action should be
> taken.
>
> - Wayne
>
> [1] https://crt.sh/?Identity=%25=5766=expired
> [2] https://crt.sh/?Identity=%25=5767=expired
> 
> [3] https://bug1332435.bmoattachments.org/attachment.cgi?id=8828490
> [4]
> https://www.tuvit.de/fileadmin/Content/TUV_IT/zertifikate/en/6749UE_s.pdf
> 
> [5] https://cabforum.org/pipermail/public/2016-September/008475.html
> 
> [6] https://bugzilla.mozilla.org/show_bug.cgi?id=1439127
> ___
> dev-security-policy mailing list
> dev-security-policy@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-security-policy
>



-- 
konklone.com | @konklone 


Re: Following up on Trustico: reseller practices and accountability

2018-03-11 Thread Eric Mill via dev-security-policy
Though I'm not a GlobalSign customer, I'm told that GlobalSign sent the
following email to its partner ecosystem:

Dear Partner,

In reaction to current events related to the private key exposure and mass
revocation by Trustico/Digicert, GlobalSign is reaching out to its trusted
Partners and Resellers to survey how they approach customer private key and
CSR generation. The most secure method is to generate the keys on the
server and then use the CSR when requesting the certificate. If you do
generate private keys for any of your customers outside of the web server
environment where the certificate will be hosted, we request that you stop
this practice immediately.

We ask that all Partners and Resellers complete the following short
questionnaire as soon as possible or by: Friday, March 16, 2018.

Compliance questionnaire : [REDACTED]
Note: Only one questionnaire needs to be completed per company.

Thank you in advance for your cooperation and attention to this important
topic.

Kind regards,
GlobalSign Security and Compliance
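For concreteness, the on-server generation the email above recommends (the private key never leaves the machine; only the CSR is sent to the CA) can be sketched with the widely used Python `cryptography` package; the hostname is a placeholder:

```python
from cryptography import x509
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.x509.oid import NameOID

# Generate the key pair on the server that will host the certificate.
key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# Build a CSR for a placeholder hostname and sign it with the new key.
csr = (
    x509.CertificateSigningRequestBuilder()
    .subject_name(x509.Name(
        [x509.NameAttribute(NameOID.COMMON_NAME, "www.example.com")]))
    .sign(key, hashes.SHA256())
)

# Only this PEM leaves the machine; the private key is never transmitted.
csr_pem = csr.public_bytes(serialization.Encoding.PEM).decode()
```

A reseller form that does this step for the customer is exactly the pattern GlobalSign is asking its partners to stop: the moment a third party runs this code, it holds the subscriber's key.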


So it's nice to see that at least one CA is taking action on this topic
without being ordered to (that I'm aware of). Obviously not all resellers
are like Trustico and perform only a straight certificate pass-through, as
Ryan Sleevi pointed out, and key escrow is a necessary part of acting as a
host, or CDN, or other authorized representative.

But surely there is still some way to responsibly look through the
ecosystem for resellers that do not perform an authorized function that
requires key escrow. Are any other CAs willing to do something like
GlobalSign has done?

Also, it would be very helpful if GlobalSign could share some information
relating to the responses they get, even if they are aggregated or
anonymized.

-- Eric

On Sun, Mar 4, 2018 at 4:04 PM, Eric Mill  wrote:

> Last week, Trustico (a reseller, formerly for Symantec and now for Comodo)
> sent 23,000 private keys to DigiCert, to force their revocation. This
> showed that Trustico had been storing customer keys generated through one
> or more CSR/key generation forms on their website.
>
> Though Trustico disagrees, this appears to be a clear case of routine key
> compromise for subscribers who obtained their key from Trustico. The
> security of Trustico's systems, which are not audited or accountable to
> root program requirements, were storing large amounts of key material whose
> compromise could have led to the subsequent compromise of connections to
> tens of thousands of online services.
>
> It was also noted that Trustico was exposing key material to interception
> by a number of third parties through client-side JavaScript embeds, and
> that Trustico's website had functionality that allowed remote code
> execution as root on one of their web servers.
>
> These m.d.s.p threads document/link to those things:
>
> * https://groups.google.com/d/topic/mozilla.dev.security.policy/wxX4Yv0E3Mk/discussion
> * https://groups.google.com/d/topic/mozilla.dev.security.policy/BLvabFwcJqo/discussion
>
> As part of the second thread, Comodo noted:
>
> We also asked Trustico to cease offering any tools to generate and/or
> retain customer private keys.  They have complied with this request and
> have confirmed that they do not intend to offer any such tools again in the
> future.
>
>
> That is good to hear, but a "we won't do it again" response, if accepted
> by Comodo as sufficient, seems disproportionate to the severity of the
> issue, given Trustico's unfamiliarity with norms around private key
> management, and with basic security practices.
>
> It's also clear from the experience that rules of the road for resellers
> are unclear, and that accountability is limited. It seems possible, or
> likely, that other resellers may also be mishandling customer keys.
>
> So, what would useful next steps be to improve security and accountability
> for resellers?
>
> One thought: Mozilla could ask CAs to obtain a written response from all
> contracted resellers about if/how they interact with customer key material,
> including the level of isolation/security given their key generation
> environment (if they have one), and whether any third-party JavaScript is
> given access to generated key material.
>
> Any other ideas?
>
> Also -- Comodo noted:
>
> Trustico have also confirmed to us that they were not, and are not, in
> possession of the private keys that correspond to any of the certificates
> that they have requested for their customers through Comodo CA.
>
>
> Since there appears to have been a significant overlap period, between the
> time Trustico switched to Comodo and when Trustico was asked by Comodo to
> cease key storage practices, it's a little hard to take at face value the
> assurance that Trustico was never in possession of any Comodo keys. It
> would be nice to hear something from Comodo about whether they've verified
> this in any more detail.
>
> -- Eric
>
> --
> konklone.com | @konklone 

Re: Following up on Trustico: reseller practices and accountability

2018-03-05 Thread Eric Mill via dev-security-policy
I think it probably makes more sense to focus on sensitive key material
than non-sensitive CSRs.

As a starting point, how reasonable would it be for CAs to question their
resellers, or to disseminate additional language to add to their reseller
agreements to prohibit non-transactional/ephemeral key storage?

-- Eric

On Mon, Mar 5, 2018 at 9:15 AM, Ryan Sleevi via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> Considering that the Baseline Requirements explicitly acknowledge that the
> Applicant may delegate the obtaining of their certificates to a third-party
> (Applicant Representative), how would you propose that the applicant's
> agents (which, in a legal sense, is the name for their employees - that is,
> those with legal authority to represent the company) and resellers?
>
> What would stop  someone from offering a "CSR-as-a-service" in which they
> generate CSRs for users, and then users take that generated CSR to the CA?
> What role are you suggesting that the CA has to play in policing 'how' the
> CSR was generated - since a CSR is-a CSR is-a CSR?
>
> On Mon, Mar 5, 2018 at 8:26 AM, James Burton via dev-security-policy <
> dev-security-policy@lists.mozilla.org> wrote:
>
> > Currently, resellers are allowed to submit CSRs on behalf of their
> > customers and as we've seen this is open to abuse. Maybe it's time to
> stop
> > this practice and restrict submission of CSRs to CA portals only.
> >
> > On Mon, Mar 5, 2018 at 12:51 PM, okaphone.elektronika--- via
> > dev-security-policy <dev-security-policy@lists.mozilla.org> wrote:
> >
> > > On Sunday, 4 March 2018 22:44:26 UTC+1, Paul Kehrer  wrote:
> > > > On March 4, 2018 at 5:06:41 PM, Eric Mill via dev-security-policy (
> > > > dev-security-policy@lists.mozilla.org) wrote:
> > > >
> > > > 
> > > >
> > > > It's also clear from the experience that rules of the road for
> > resellers
> > > > are unclear, and that accountability is limited. It seems possible,
> or
> > > > likely, that other resellers may also be mishandling customer keys
> > > >
> > > > So, what would useful next steps be to improve security and
> > > accountability
> > > > for resellers?
> > > >
> > > >
> > > > As you already suggested an official communication requesting
> > information
> > > > from the CAs about the way their reseller networks manage subscriber
> > key
> > > > material would be a good start. Eventually I think it's likely that
> > > > resellers need to be subject to some limited form of audit (maybe as
> > > > simplistic as a PCI self-assessment questionnaire?). While that
> doesn't
> > > > prevent bad behavior it would generate an evidence trail for
> > termination
> > > of
> > > > relationships with incompetent/malicious actors.
> > >
> > > I'm not sure that that would be reasonable. After all resellers can
> have
> > > resellers, and so on so where would that end? With the end user having
> to
> > > accept a "general license agreement"? And distrusting a reseller could
> > also
> > > be difficult.
> > >
> > > I think it will have to be/remain the responsibility of the CA to
> choose
> > > their reselllers in such a way that "not too many questions are being
> > > asked" about them.
> > >
> > >
> > > > Of course, CAs are likely to be reluctant to share a complete list of
> > > their
> > > > resellers since they probably consider that competitive information.
> > So,
> > > it
> > > > would be nice if we could just make it part of the CA's audits...
> > > >
> > > > One way to do that would be that the baseline requirements could be
> > > updated
> > > > to create a section defining requirements placed upon resellers
> > > (especially
> > > > around subscriber key management). This way CAs would be incentivized
> > to
> > > > manage their business relationships more carefully since their
> business
> > > > partners could cause them audit issues. This has some precedent since
> > in
> > > > the past some resellers acted as RAs and had their own subordinates
> --
> > a
> > > > practice that was terminated as they came under scrutiny and demands
> > for
> > > > audits.
> > > >
> > > > Mozilla, of course, cannot amend the BRs itself. However, past
> evidence
> > >

Following up on Trustico: reseller practices and accountability

2018-03-04 Thread Eric Mill via dev-security-policy
Last week, Trustico (a reseller, formerly for Symantec and now for Comodo)
sent 23,000 private keys to DigiCert, to force their revocation. This
showed that Trustico had been storing customer keys generated through one
or more CSR/key generation forms on their website.

Though Trustico disagrees, this appears to be a clear case of routine key
compromise for subscribers who obtained their key from Trustico. The
security of Trustico's systems, which are not audited or accountable to
root program requirements, were storing large amounts of key material whose
compromise could have led to the subsequent compromise of connections to
tens of thousands of online services.

It was also noted that Trustico was exposing key material to interception
by a number of third parties through client-side JavaScript embeds, and
that Trustico's website had functionality that allowed remote code
execution as root on one of their web servers.

These m.d.s.p threads document/link to those things:

*
https://groups.google.com/d/topic/mozilla.dev.security.policy/wxX4Yv0E3Mk/discussion
*
https://groups.google.com/d/topic/mozilla.dev.security.policy/BLvabFwcJqo/discussion

As part of the second thread, Comodo noted:

We also asked Trustico to cease offering any tools to generate and/or
retain customer private keys.  They have complied with this request and
have confirmed that they do not intend to offer any such tools again in the
future.


That is good to hear, but a "we won't do it again" response, if accepted by
Comodo as sufficient, seems disproportionate to the severity of the issue,
given Trustico's unfamiliarity with norms around private key management,
and with basic security practices.

It's also clear from the experience that rules of the road for resellers
are unclear, and that accountability is limited. It seems possible, or
likely, that other resellers may also be mishandling customer keys.

So, what would useful next steps be to improve security and accountability
for resellers?

One thought: Mozilla could ask CAs to obtain a written response from all
contracted resellers about if/how they interact with customer key material,
including the level of isolation/security given their key generation
environment (if they have one), and whether any third-party JavaScript is
given access to generated key material.

Any other ideas?

Also -- Comodo noted:

Trustico have also confirmed to us that they were not, and are not, in
possession of the private keys that correspond to any of the certificates
that they have requested for their customers through Comodo CA.


Since there appears to have been a significant overlap period, between the
time Trustico switched to Comodo and when Trustico was asked by Comodo to
cease key storage practices, it's a little hard to take at face value the
assurance that Trustico was never in possession of any Comodo keys. It
would be nice to hear something from Comodo about whether they've verified
this in any more detail.

-- Eric

-- 
konklone.com | @konklone 


Re: How do you handle mass revocation requests?

2018-02-28 Thread Eric Mill via dev-security-policy
Trustico doesn't seem to provide any hosting or CDN services that would
make use of the private key, nor do they appear to explicitly inform users
about the storage of this private key.

In their statement, they say they keep the private keys explicitly to
perform revocation as necessary:
https://www.trustico.com/news/2018/symantec-revocation/certificate-replacement.php
(archived: https://archive.is/0AnyR )

> These Private Keys are stored in cold storage, for the purpose of
> revocation.

Their CSR/key generation form is here:
https://www.trustico.com/ssltools/create/csr-pem/create-a-new-csr-instantly.php
(archived: https://archive.is/hJV42 )

The storage of private keys appears to be done without the user's knowledge
or consent. And of course, only the keys they create through their form are
stored, so it is clearly not a necessary business function for most of
their certificate business.

Finally -- the CSR/key generation form page incorporates JavaScript from at
least 5 or 6 different companies (including ad servers), which would allow
any of those third parties (intentionally or through compromise of their
own) to capture generated keys. This is a reckless amount of exposure, to
the point that even if the keys were generated completely inside the
browser and never exposed to the server (which does not appear to be the
case), I would consider them compromised at the time of generation.

Given everything that's known, then regardless of who emailed whose
customers when and why, I think it's clear that Trustico compromised those
keys at _least_ at the time they were stored, if not at the time of
generation, and has been routinely compromising customer keys for years.
Emailing them to DigiCert only widened their exposure to more unauthorized
parties.

And given that there's no evidence that Trustico has acknowledged this
fact, or indicated any intent to change their business practices, then I
believe it's appropriate for all CAs to immediately suspend or terminate
their relationship with Trustico -- as any CA who continued doing business
with Trustico would now be knowingly allowing Trustico to compromise the
keys of the certificates issued under their hierarchy.

-- Eric

On Wed, Feb 28, 2018 at 3:24 PM, Ryan Hurst via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> On Wednesday, February 28, 2018 at 11:56:04 AM UTC-8, Ryan Sleevi wrote:
> > Assuming Trustico sent the keys to DigiCert, it definitely sounds like
> even
> > if Trustico was authorized to hold the keys (which is a troubling
> argument,
> > given all things), they themselves compromised the keys of their
> customers,
> > and revocation is both correct and necessary. That is, whether or not
> > Trustico believed they were compromised before, they compromised their
> > customers keys by sending them, and it's both correct and accurate to
> > notify the Subscribers that their keys have been compromised by their
> > Reseller.
>
> That seems to be the case to me as well.
>
> It also seems that this situation should result in the UAs and/or CABFORUM
> re-visiting section 6.1.2
> (https://github.com/cabforum/documents/blob/master/docs/BR.md) in the BRs.
>
> Specifically, this section states:
>
> ```
> Parties other than the Subscriber SHALL NOT archive the Subscriber Private
> Key without authorization by the Subscriber.
>
> If the CA or any of its designated RAs generated the Private Key on behalf
> of the Subscriber, then the CA SHALL encrypt the Private Key for transport
> to the Subscriber.
> ```
>
> In this case, Trustico is not the subscriber, and there is no indication
> in their terms and conditions
> (https://www.trustico.com/terms/terms-and-conditions.php) that they are
> authorized to archive the private key. Yet clearly, if they were able to
> provide 20k+ private keys to DigiCert, they are archiving them. This text
> seems to cover this case
> clearly but as worded I do not see how audits would catch this behavior. I
> think it may make sense for the CAs to be responsible for demonstrating how
> they and other non-subscribers in the lifecycle flow handle this case.
>
> Additionally, it seems that if the private keys were provided to DigiCert
> in a form that DigiCert could verify, they may have been stored in a
> non-encrypted fashion; at a minimum, they were likely not generated and
> protected on an HSM. The BRs should probably be revised to specify some
> minimum level of security to be provided in these cases, or for these
> cases to be disallowed altogether.
>
> Finally, the associated text speaks to RAs but not to the non-subscriber
> (reseller) case; at a minimum, this gap should be addressed.
> ___
> dev-security-policy mailing list
> dev-security-policy@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-security-policy
>



-- 
konklone.com | @konklone 

Re: Japan GPKI Root Renewal Request

2018-02-28 Thread Eric Mill via dev-security-policy
On Wed, Feb 28, 2018 at 12:58 AM, apca2.2013--- via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> "I would like to again point out that simply waiting for misissued
> certificates to expire is not an acceptable response."
>
> This is a misunderstanding.
> We are preparing to revoke certificates immediately, rather than waiting
> for certificates issued prior to 2017 to expire.
> However, even if we revoke those certificates, if your judgment is not
> affected and our request is rejected, there is no point in doing it.
>

So, to be clear, you would only revoke misissued certificates if required
to do so by Mozilla -- not because they represent control failures, or in
order to demonstrate to other root programs your CA's responsiveness and
the seriousness with which you take control failures.


> Please let us know whether our request would be accepted if we revoke all
> the certificates we issued prior to 2017.


> Thank you
> APCA
>
>
> On Wednesday, February 28, 2018 at 7:51:23 AM UTC+9, Wayne Thayer wrote:
> > To conclude this discussion, Mozilla is denying the Japanese Government
> > ApplicationCA2 Root inclusion request.  I'd like to thank everyone for
> your
> > constructive input into the discussion, and I'd like to thank the
> Japanese
> > Government representatives for their patience and work to address issues
> as
> > they have been discovered. I will be resolving the bug as "WONTFIX".
> >
> > The Japanese Government PKI may submit a newly generated root and
> key-pair
> > for inclusion, and this submission can be made using the existing bug (
> > https://bugzilla.mozilla.org/show_bug.cgi?id=870185).
> >
> > On Thu, Feb 22, 2018 at 11:57 PM, apca2.2013--- via dev-security-policy <
> > dev-security-policy@lists.mozilla.org> wrote:
> >
> > > We are a certificate authority controlled by the Government of Japan,
> > > and we issue certificates only for servers operated by the government.
> > >
> > > The certificates that you pointed out as concerning will expire and
> > > be reissued, so we think that the problem will be solved.
> > >
> > > I would like to again point out that simply waiting for misissued
> > certificates to expire is not an acceptable response.
> >
> >
> > > We will continue to take BR audits in the future so we will operate as
> a
> > > secure certification authority and we appreciate your continued
> support.
> > >
> > > - Wayne
>





Re: Certificates with 2008 Debian weak key bug

2018-02-05 Thread Eric Mill via dev-security-policy
WoSign and StartCom are untrusted, but Certum is still trusted, right?

On Mon, Feb 5, 2018 at 11:08 AM, Hanno Böck via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> Hi,
>
> I searched crt.sh for valid certificates vulnerable to the 2008 Debian
> weak key bug. (Only 2048 bit.)
>
> Overall I found 5 unexpired certificates.
>
> Two certificates by Certum (reported on Saturday, Certum told me "We
> have taken necessary steps to clarify this situation as soon as
> possible", they're not revoked yet):
> https://crt.sh/?id=308392091&opt=ocsp
> https://crt.sh/?id=663&opt=ocsp
>
> Wosign:
> https://crt.sh/?id=30347743
> StartCom:
> https://crt.sh/?id=54187884
> https://crt.sh/?id=307753186
>
> As we all know, these are no longer trusted by Mozilla, but I reported
> them nevertheless. No reply yet.
>
> Old bugs never die; I recommend every CA add a check for the Debian weak
> key bug to its certificate issuance process.
>
> --
> Hanno Böck
> https://hboeck.de/
>
> mail/jabber: ha...@hboeck.de
> GPG: FE73757FA60E4E21B937579FA5880072BBB51E42
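Hanno's closing recommendation, that every CA check candidate keys against the known Debian weak-key lists, can be sketched roughly as follows. This is a minimal illustration with a toy blacklist, not a definitive implementation: it assumes the convention used by the openssl-blacklist/openssl-vulnkey tooling, in which each blacklist entry is the trailing 20 hex characters of SHA-1 over the string `Modulus=<UPPERCASE-HEX>\n`. A real deployment should load the published blacklists rather than the stand-in entry built here.

```python
import hashlib

def blacklist_entry(modulus_hex):
    """Blacklist key for an RSA modulus, modeled on the openssl-blacklist
    format (assumption): last 20 hex chars of SHA-1("Modulus=<HEX>\n")."""
    data = ("Modulus=%s\n" % modulus_hex.upper()).encode("ascii")
    return hashlib.sha1(data).hexdigest()[-20:]

# Toy blacklist; "00C0FFEE" stands in for a real Debian-weak modulus.
WEAK = {blacklist_entry("00C0FFEE")}

def is_debian_weak(modulus_hex, blacklist=WEAK):
    """Issuance-time check: refuse the CSR if its modulus is blacklisted."""
    return blacklist_entry(modulus_hex) in blacklist
```

Such a check is cheap at issuance time: the full published lists fit comfortably in an in-memory set, so the cost per CSR is one hash and one lookup.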



-- 
konklone.com | @konklone 
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: Possible Issue with Domain Validation Method 9 in a shared hosting environment

2018-01-15 Thread Eric Mill via dev-security-policy
On Mon, Jan 15, 2018 at 4:22 PM, Ryan Sleevi <r...@sleevi.com> wrote:

>
>
> On Mon, Jan 15, 2018 at 4:11 PM, Eric Mill via dev-security-policy <
> dev-security-policy@lists.mozilla.org> wrote:
>
>> That said, GlobalSign's offer to cut certificate lifetimes down to X
>> months
>> during the short-term, and to make sure OneClick is disabled within Y
>> months from now, seems like a reasonable compromise that doesn't undercut
>> the incentive for GlobalSign or their customers to rapidly transition to
>> more secure methods. It seems like there should be a value of X and Y that
>> are acceptable.
>>
>
> There are a lot of factors to consider, as I tried to highlight, that
> contribute to whether or not X can be > 0 and whether Y can be > 0 (e.g. no
> issuance, immediate discontinuance) at all. That is, these additional
> factors beyond the protocol itself inherently contribute to whether or not
> there is a generalizable answer. Factors such as ecosystem impact versus
> ecosystem risk, existing practices that can be used as mitigating factors,
> the level of trust necessary to ascribe to the issuing CA (and the steps
> that are taken to mitigate failures of that trust) - all influence that
> calculus.
>

I can only go on what's on the public list, but if it is as it appears and
GS proactively researched their offering, identified a similar weakness via
a separate BR method, and voluntarily turned off their implementation right
away, that is the kind of activity I would think Mozilla and Google would
want to encourage (and not accidentally penalize). An X of 3 months (90
days) and a Y that resembled Let's Encrypt's deprecation timeline might
help offer a lifeline without introducing unacceptable risk.

I understand that there are other factors, including the level of review
the protocol has been subject to and a holistic consideration of
GlobalSign's infrastructure and history, including non-public information,
and I'm not saying it would be necessarily unfair to keep GS' OneClick
offline for shared hosts. But I do think that incentivizing proactive
security interventions on the part of CAs is another factor worth
considering.

-- Eric



Re: Possible Issue with Domain Validation Method 9 in a shared hosting environment

2018-01-15 Thread Eric Mill via dev-security-policy
On Mon, Jan 15, 2018 at 2:30 PM, Ryan Sleevi via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> On Mon, Jan 15, 2018 at 1:18 PM, Doug Beattie  >
> wrote:
>
> >
> > - The potential risk in maintaining this whitelist, given both the
> > statements provided by plans to move to deprecate this method post-haste
> > (e.g. no such plans) and the validity period of issued certificates (up
> to
> > 39 months or, soon, 825 days).
> >
> > Since LE can continue to renew certificates issued under this method
> prior
> > to this change, doesn’t that effectively allow longer effective validity
> > periods?  I recognize there is a difference between renewing and long
> > validity certs, but allowing renewal of certs issued under the flawed
> > method seems to reduce value of your argument here.
> >
>
> No, it doesn't, because in the event of misissuance, the attacker's ability
> is not the full duration (or 5 months, as you suggest), but bounded by the
> lifetime of the certificate. These are fundamentally different risks - and
> that's why the validity period of the certificate itself is far more
> important than the reuse period of the information. A victim can contact an
> ACME using CA to invalidate the information, thus preventing renewal, and
> the attacker is still bound to the lifetime of the existing certificate.
>
> Compare this with a certificate issued for 1-3 years by GlobalSign, in which
> even if a victim contacts GlobalSign, the most that GlobalSign can do is to
> revoke that certificate, which is ineffective at scale. This permits the
> attacker a far greater 'attack' window, even though GS might have revoked
> it, and is a key and fundamental difference.
>

I think this may be the key difference of perspective. GlobalSign might
view revocation as an effective attack mitigation for a victim, but I don't
think (and obviously Chrome doesn't think, given their lack of support for
revocation in the common case) that it is likely to be effective.

If I were a victim, I would contact the ACME-using CA to invalidate the
reuse of domain validation information for those hostnames, which would be
a reliable technical control. I would also request revocation as a
best-effort thing, but I would not feel comfortable with the level of risk
I was experiencing (given the lack of effective revocation support in not
just Chrome, but also reams of other HTTP clients) until the expiration
date of the certificate had passed.

That said, GlobalSign's offer to cut certificate lifetimes down to X months
during the short-term, and to make sure OneClick is disabled within Y
months from now, seems like a reasonable compromise that doesn't undercut
the incentive for GlobalSign or their customers to rapidly transition to
more secure methods. It seems like there should be a value of X and Y that
are acceptable.

-- Eric



Re: Proposed policy change: require private pre-notification of 3rd party subCAs

2017-10-24 Thread Eric Mill via dev-security-policy
Ben, I think Gerv addressed Doug's concern and indicated that situation
wouldn't fall under this policy. If that's not accurate, it'd be worth an
on-list clarification.

On Tue, Oct 24, 2017 at 9:01 AM, Ben Wilson via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> I echo Doug's concerns.  I can see this process as being useful/helpful
> where the private key is off-premises, but for vanity CAs where the
> operator
> of the root CA maintains control of the private key the same as it does for
> all other issuing CAs, I think this would introduce unnecessary additional
> steps.
>
> -Original Message-
> From: dev-security-policy
> [mailto:dev-security-policy-bounces+ben=digicert@lists.mozilla.org] On
> Behalf Of Doug Beattie via dev-security-policy
> Sent: Tuesday, October 24, 2017 9:44 AM
> To: Gervase Markham ;
> mozilla-dev-security-pol...@lists.mozilla.org
> Subject: RE: Proposed policy change: require private pre-notification of
> 3rd
> party subCAs
>
> Gerv,
>
> I assume this applies equally to cross signing, but not to "Vanity" CAs
> that
> are set up and run by the CA on behalf of a customer.  Is that accurate?
>
> Doug
>
> > -Original Message-
> > From: dev-security-policy [mailto:dev-security-policy-
> > bounces+doug.beattie=globalsign@lists.mozilla.org] On Behalf Of
> > Gervase Markham via dev-security-policy
> > Sent: Tuesday, October 24, 2017 11:28 AM
> > To: mozilla-dev-security-pol...@lists.mozilla.org
> > Subject: Proposed policy change: require private pre-notification of
> > 3rd party subCAs
> >
> > One of the ways in which the number of organizations trusted to issue
> > for the WebPKI is extended is by an existing CA bestowing the power of
> > issuance upon a third party in the form of control of a
> > non-technically-constrained subCA. Examples of such are the Google and
> > Apple subCAs under GeoTrust, but there are others.
> >
> > Adding new organizations to the list of those trusted is a big deal,
> > and currently Mozilla has little pre-insight into and not much control
> > over this process. CAs may choose to do this for whoever they like,
> > the CA then bears primary responsibility for managing that customer,
> > and as long as they are able to file clean audits, things proceed as
> normal.
> >
> > Mozilla is considering a policy change whereby we require private pre-
> > notification of such delegations (or renewals of such delegations).
> > We would not undertake to necessarily do anything with such
> > notifications, but lack of action should not be considered permissive in
> an estoppel sense.
> > We would reserve the right to object either pre- or post-issuance of
> > the intermediate. (Once the intermediate is issued, of course, the CA
> > has seven days to put it in CCADB, and then the relationship would
> > probably become known unless the fields in the cert were misleading.)
> >
> > This may not be where we finally want to get to in terms of regulating
> > such delegations of trust, but it is a small step which brings a bit
> > more transparency while acknowledging the limited capacity of our team
> > for additional tasks.
> >
> > Comments are welcome.
> >
> > Gerv




Re: Incident Report D-Trust certificates with ROCA fingerprints

2017-10-23 Thread Eric Mill via dev-security-policy
Hi Kim,

I appreciate that you've reported this to m.d.s.p despite the certificates
not chaining to an NSS-trusted path. However, since you've called it an
"incident report" and said you would treat this as an incident as if it
were related to NSS trust, I feel I need to point out that this doesn't
appear to contain the information the community would want to see out of an
incident report.

The details here are related overwhelmingly to German and EU law, and do
not contain concrete information about how the weaknesses were introduced,
how and when they were resolved, and what (if anything) D-Trust is doing
differently as a result.

You may have felt these were implied or not that important, given that
German smart cards are now less relevant post-eIDAS, but I think for this
to qualify as an incident report, you should make those details explicit.

-- Eric

On Thu, Oct 19, 2017 at 6:45 AM, Kim Nguyen via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> Incident Report  D-Trust certificates with ROCA fingerprints
>
> A list of certificates showing a ROCA fingerprint was posted by Rob
> Stradling at Mozilla.dev.security.policy on 2017/10/18 available at
> https://misissued.com/batch/28/
> This contains among other certificates a number of D-Trust related
> certificates that all show a ROCA fingerprint.
>
> All of these certificates are not related to WebPKI, i.e. are not chaining
> to a root trusted by NSS, but were part of the German qualified signature
> scheme under the supervision of the German supervisory body
> Bundesnetzagentur (BNetzA). The German qualified signature scheme mandated
> the sole use of specific smartcards under a specific German scheme
> (“Bestätigung nach Signaturgesetz (SigG)”) for the operation of a qualified
> PKI infrastructure according to this scheme. Qualified TSPs were bound to
> this by law.
> Smartcards in the German scheme were required to fulfill both a composite
> Common Criteria certification according to the relevant protection profiles
> as well as a specific qualification according to the German scheme. All
> components used by D-Trust during the applicability of the German
> Signaturgesetz met these requirements as confirmed by yearly audits by the
> accredited conformity assessment body TüvIT.
>
> The German qualified signature scheme was superseded by the EU eIDAS
> regulation, which overrules national signature law in the EU. The eIDAS
> regulation became mandatory at the 1st of July 2017 after a one year
> transition period. Therefore all certificates related to D-Trust at
> https://misissued.com/batch/28/ were deactivated during June 2017 and
> revoked later in order to comply with the new eIDAS requirements which
> include an eIDAS conformity assessment as well as various technical
> adaptions. The trust status of these certificates can be validated in the
> German Trusted List (TSL) located at https://www.nrca-ds.de/ which is the
> central point of trust according to the eIDAS regulation and where the
> respective status is shown as withdrawn.
>
> In the course of this transition smart cards were abandoned as the new
> eIDAS regulation now allows for a HSM based infrastructure inside a
> qualified TSP (contrary to the former situation according to the German
> Signature law).
>
> Therefore all mentioned D-Trust related certificates at
> https://misissued.com/batch/28/ are now deactivated and revoked and the
> related services are shown as withdrawn in the German TSL. Please note that
> a considerable part of these certificates were derived from a root operated
> by the supervisory body BNetzA as they were part of the so-called
> accredited qualified signature scheme as mandated by national German
> signature law.
>
> Please note that all WebPKI related systems within D-Trust are not
> affected by the issue of weak RSA key generation in Infineon components as
> all of these systems are HSM based.
>
> Kim Nguyen, D-Trust
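For readers unfamiliar with how a "ROCA fingerprint" can be asserted from a certificate list alone: vulnerable Infineon-generated RSA moduli have a special structure in which the modulus is a power of 65537 modulo each of a fixed set of small primes, so they are detectable offline from the public key. The sketch below shows the idea; the prime list is only an illustrative subset of what the public detectors use (an assumption for brevity), and real scanning should use the published roca-detect tooling rather than this toy.

```python
# Illustrative subset of the small primes used by public ROCA detectors.
PRIMES = [11, 13, 17, 19, 37, 53, 61, 71, 73, 79, 97, 103, 107, 109, 127]

def _residues_of_65537(p):
    """All residues 65537^k mod p (the cyclic subgroup generated by 65537)."""
    seen, x = set(), 1
    while x not in seen:
        seen.add(x)
        x = (x * 65537) % p
    return seen

_RESIDUES = {p: _residues_of_65537(p) for p in PRIMES}

def has_roca_fingerprint(n):
    """True if RSA modulus n matches the ROCA structure for every test prime."""
    return all(n % p in _RESIDUES[p] for p in PRIMES)
```

A random modulus passes any single prime's test only with probability |subgroup|/p, so with the full prime set used by the real detectors the false-positive rate is negligible, which is what makes a clean "fingerprint" claim possible from crt.sh data alone.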





Re: Mozilla’s Plan for Symantec Roots

2017-10-16 Thread Eric Mill via dev-security-policy
Adding code to Firefox to support the distrust of specified subCAs seems
like it would be a good long-term investment for Mozilla, as it would give
Mozilla a lot more flexibility during future distrust events.

-- Eric

On Mon, Oct 16, 2017 at 1:32 PM, Gervase Markham via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> As per previous discussions and
> https://wiki.mozilla.org/CA:Symantec_Issues, a consensus proposal[0] was
> reached among multiple browser makers for a graduated distrust of
> Symantec roots.
>
> Here is Mozilla’s planned timeline for the graduated distrust of
> Symantec roots (subject to change):
>
> * January 2018 (Firefox 58): Notices in the Developer Console will warn
> about Symantec certificates issued before 2016-06-01, to encourage site
> owners to migrate their TLS certs.
>
> * May 2018 (Firefox 60): Websites will show an untrusted connection
> error if they have a TLS cert issued before 2016-06-01 that chains up to
> a Symantec root.
>
> * October 2018 (Firefox 63): Removal/distrust of Symantec roots, with
> caveats described below.
>
> Mozilla’s release calendar is here:
> https://wiki.mozilla.org/RapidRelease/Calendar
>
> However, there are some subCAs of the Symantec roots that are
> independently operated by companies whose operations have not been
> called into question, and they will experience significant hardship if
> we do not provide a longer transition period for them. For both
> technical and non-technical reasons, a year is an extremely unrealistic
> timeframe for these subCAs to transition to having their certificates
> cross-signed by another CA. For example, the subCA may have implemented
> a host of pinning solutions in their products that would fail with
> non-Symantec-chaining certificates, or the subCA may have large numbers
> of devices that would need to be tested for interoperability with any
> potential future vendor. And, of course contractual negotiations may
> take a significant amount of time.
>
> The subCAs that we know of that fall into this category belong to Google
> and Apple. If there are any other subCAs that fall into this category,
> please let us know immediately. Google has one such subCA; Apple has seven.
>
> There are two ways that we can provide a smoother transition for these
> subCAs.
>
> Option 1)
> Temporarily treat these subCAs as directly-included trust-anchors.
> Mozilla prefers *not* to take this approach, because even if clearly
> explained up front that it is a temporary solution with deadlines, it
> would be very easy for people to start treating such a subCA as a
> regular trust anchor, and thereby have that subCA become a de facto
> included CA. Additionally, it could become very complicated to remove
> such subCAs in the future, especially if they have not performed the
> recommended transitions.
>
> Option 2)
> Add code to Firefox to disable the root such that only certain subCAs
> will continue to function. So, the final dis-trust of Symantec roots may
> actually involve letting one or two of the root certs remain in
> Mozilla’s trust store, but having special code to distrust all but
> specified subCAs. We would document the information here:
> https://wiki.mozilla.org/CA/Additional_Trust_Changes
> And Mozilla would add tooling to the CCADB to track these special subCAs
> to ensure proper CP/CPS/audits until they have been migrated and
> disabled, and the root certs removed. Mozilla will need to also follow
> up with these subCAs to ensure they are moving away from these root
> certificates and are getting cross-signed by more than one CA in order
> to avoid repeating this situation.
>
> According to option 2 and the plan listed above, here is how the
> currently-included Symantec root certs will be treated in Firefox 63:
>
> = Symantec roots to be disabled via code, *not* removed from NSS =
>
> GeoTrust Global CA
> GeoTrust Primary Certification Authority - G2
> GeoTrust Primary Certification Authority - G3
>
> = Symantec roots that will be fully removed from NSS =
>
> GeoTrust Primary Certification Authority
> GeoTrust Universal CA
> GeoTrust Universal CA 2
> Symantec Class 1 Public Primary Certification Authority - G4
> Symantec Class 1 Public Primary Certification Authority - G6
> Symantec Class 2 Public Primary Certification Authority - G4
> Symantec Class 2 Public Primary Certification Authority - G6
> thawte Primary Root CA
> thawte Primary Root CA - G2
> thawte Primary Root CA - G3
> VeriSign Class 1 Public PCA - G3
> VeriSign Class 2 Public PCA - G3
> VeriSign Class 3 Public Primary Certification Authority - G3
> VeriSign Class 3 Public Primary Certification Authority - G4
> VeriSign Class 3 Public Primary Certification Authority - G5
> VeriSign Universal Root Certification Authority
>
> As always, we appreciate your thoughtful and constructive feedback on this.
>
> Gerv
>
> [0]
> https://groups.google.com/a/chromium.org/forum/#!topic/
> blink-dev/eUAKwjihhBs%5B251-275%5D
> 
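The "special code" approach in Option 2 amounts to a chain-building rule: if a chain terminates in one of the disabled roots, accept it only when an intermediate on an explicit allowlist appears in the chain. A minimal sketch of that rule is below; the subject names and SPKI hash values are stand-ins, and keying the allowlist on SPKI hashes is an assumption for illustration, not a description of the actual Firefox implementation.

```python
from dataclasses import dataclass

@dataclass
class Cert:
    subject: str
    spki_sha256: str  # hex digest of the SubjectPublicKeyInfo (stand-in values)

# Hypothetical data: one disabled-but-not-removed root, one allowlisted subCA.
DISTRUSTED_ROOTS = {"GeoTrust Global CA"}
ALLOWLISTED_SUBCA_SPKIS = {"aa" * 32}  # e.g. an independently operated subCA

def chain_trusted(chain):
    """chain: leaf-first list of Certs, ending at the root."""
    root = chain[-1]
    if root.subject not in DISTRUSTED_ROOTS:
        return True  # ordinary roots are unaffected by the special-case code
    # Disabled root: trust only if some intermediate is on the allowlist.
    return any(c.spki_sha256 in ALLOWLISTED_SUBCA_SPKIS for c in chain[1:-1])
```

The appeal of this design is exactly what the message describes: the root stays in NSS so the allowlisted subCAs keep working, while every other path through it fails, and the allowlist can be shrunk release by release as the subCAs migrate.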

Re: PROCERT issues

2017-09-29 Thread Eric Mill via dev-security-policy
On Thu, Sep 28, 2017 at 12:50 PM, Gervase Markham via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> On 27/09/17 18:54, Matthew Hardeman wrote:
> > In the case of StartCom, I can not help but feel that they are being
> > held to an especially high standard (higher than other prior adds to
> > the program) in this new PKI because of who they are -- despite the
> > fact that management and day-to-day decisions are a completely
> > different team.
> >
> > Where I am headed with this is a concern that perhaps no amount of
> > technical remediation can really get these entities back in the
> > graces of the community.
>
> I don't know if it's quite as absolute as that, but recent incidents
> have caused me to ponder somewhat on the nature of trust. The root
> program is all about trust, and trust is not something which can be
> encoded in audits, checkboxes and rules. This will always be a tension
> at the heart of our root program - we are trying to be as objective as
> we can about something which is ultimately subjective.
>
> The nature of trust is that it's harder to regain than it is to gain in
> the first place. Just ask someone who's been the victim of adultery - or
> someone who is a now-repentant adulterer. Rightly or wrongly, people get
> a first chance, but it's tough to get a second. I think you are right
> when you conclude that this is just the way of things, and we should
> accept it rather than kick against it.
>

That dynamic is natural, but accepting that this dynamic exists is
different from giving in to it in some absolute way. When offering second
chances, requiring that the person/org fulfill certain conditions that
speak directly to their ability to have learned and adapted from the thing
they failed at the first time is an approach that accepts this dynamic,
without shutting the door on people or organizations that have grown as a
result of the experience.

I think it would arguably lead to worse behavior, and less disclosure of
incidents and mistakes, if Mozilla adopted a posture where second chances
are rarely given. Not saying that's what's being said here, but I think
it's worth emphasizing that the first principle here should be to optimize
for incentivizing the behavior you want out of the CA community that
protects users and increases information sharing.

-- Eric


>
> Gerv





Re: FW: StartCom inclusion request: next steps

2017-09-17 Thread Eric Mill via dev-security-policy
I didn't understand StartCom's comment below about the cross-sign very well
at first, but after Ryan's message I understand it better in retrospect:

> On Thu, Sep 14, 2017 at 11:05 AM, Inigo Barreira via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:
>
> I've never said this. In fact, despite having that cross-sign, which
was provided to us in July, we have never used it or provided it to any of
our customers to build a trusted path. So none of those 5, or the new ones,
go with the Certinomis path, because none have it. But all those 5 certs
are untrusted because we're not in the Mozilla root, not the new one, and
the old one was distrusted.
>
> In fact, recently, I asked for permission to use the Certinomis
cross-signed certificates and have had no response. I don't know if this is
an administrative silence which may allow me to use it, but until we have a
clear direction we haven't used it.

So this appears to be saying that "all those 5 certs are untrusted"
because StartCom didn't provide the full chain to customers, even though
such a chain could be constructed. The cross-signature wasn't published in
CT until August 2nd, but that's not any sort of guarantee that the
cross-signature wasn't discoverable by other means -- its availability
until August 2nd is a function of actions by Certinomis that are not
disclosed. The August 2nd date is also after StartCom's actions were being
publicly questioned, so it suggests the possibility that the
cross-signature would have been kept secret for longer, and was only
submitted to CT once scrutiny had increased.

Whether the cross-cert was issued before the audit report date is also a
mystery, especially if it's possible that either Certinomis or StartCom was
operating under the assumption that the cross-signature is irrelevant until
"delivered" to customers.

StartCom has remarked several times in this thread that they are being
treated unfairly, but I can think of at least one comparison to a previous
distrust event, which is that one of the more significant (in my opinion)
issues with Symantec's now-deprecated PKI is that there existed chains that
brought U.S. Federal PKI certificates into being trusted by Mozilla. Those
chains were, as far as I know, never delivered proactively to customers,
but could easily be constructed by any interested party with sufficient
knowledge of the universe of cross-signatures. For example, Qualys' SSL
Labs reports would automatically construct those chains for sites using
FPKI certs, and let users download the full chain in one click.

The threat model here is not what ordinary inexpert customers do, but what
opportunities an adversary has available to them among the universe of
trusted CAs to obtain certificates. In the Symantec/FPKI case, the problem
was that an adversary could easily use an FPKI certificate to intercept
connections made by Mozilla products, whether or not Symantec or the FPKI
ever advertised or proactively enabled this use case. What made this such a
big issue, in addition to the scope of the technical impact, is that the
issue was not noticed or elevated for years, during which multiple
"generations" of cross-signs had been issued and expired. It brought
Symantec's ability to understand their own PKI into serious question.

So I think the biggest issue here is not so much the technical impact, but
that StartCom was communicating inaccurate information to Mozilla. The
certs were publicly trusted by Certinomis, whether the cross-signature was
delivered to StartCom or to customers or to no one. While presumably this
inaccuracy was unintentional, it was enough to cause Gerv to express public
confusion and doubt about whether the certificates were part of the
cross-signed hierarchy. It also reflects a potentially dangerous difference
of perspective between StartCom and root stores in how StartCom evaluates
the trust and impact of the certificates they issue. For a CA that has been
operating for as long as StartCom has, I think it's fair to describe this
as concerning.

I also think that Certinomis, whose cross-signing practices are now being
scrutinized, should proactively post to this list with a timeline of its
own actions during this process, so that their actions can be understood in
the context of StartCom's.

-- Eric
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: O=U.S. Government for non-USG entity (IdenTrust)

2017-08-31 Thread Eric Mill via dev-security-policy
Thank you for the continued updates, and for relaying the deadline by which
these will be revoked.

On Thu, Aug 31, 2017 at 9:35 PM, identrust--- via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> On Monday, August 28, 2017 at 3:28:01 PM UTC-4, iden...@gmail.com wrote:
> > On Friday, August 18, 2017 at 7:22:06 PM UTC-4, iden...@gmail.com wrote:
> > > On Thursday, August 17, 2017 at 2:35:15 PM UTC-4, Jonathan Rudenberg
> wrote:
> > > > > On Aug 17, 2017, at 14:24, identrust--- via dev-security-policy <
> dev-security-policy@lists.mozilla.org> wrote:
> > > > >
> > > > > Hello, In reference to 3)"Certificates that appear to be intended
> as client certificates, but have the anyExtendedKeyUsage EKU, putting them
> in scope for the Mozilla Root Policy."
> > > > > The following 6 client certificates have been identified as
> server certificates and flagged as non-compliant.  However, these
> certificates do not contain an FQDN, an IP address, or the ‘TLS Web Server
> Authentication’ EKU.  As such, in order for us to proceed with our analysis
> and determine whether any remediation is required, we need clarification on
> the exact nature of the non-compliance as it relates to the Mozilla Root
> Policy or the CA/B Forum Baseline Requirements (ideally with a pointer to
> the specific requirement in the corresponding documents).
> > > >
> > > > The Mozilla Root Store Policy section 1.1 (Scope) says:
> > > >
> > > > > This policy applies, as appropriate, to certificates matching any
> of the following (and the CAs which control or issue them):
> > > > > …
> > > > > 3. End-entity certificates which have at least one valid,
> unrevoked chain up to such a CA certificate through intermediate
> certificates which are all in scope, such end-entity certificates having
> either:
> > > > > - an Extended Key Usage (EKU) extension which contains one or more
> of these KeyPurposeIds: anyExtendedKeyUsage, id-kp-serverAuth,
> id-kp-emailProtection; or: …
> > > >
> > > > The six certificates linked contain the anyExtendedKeyUsage
> KeyPurposeId and were issued by an intermediate that is also in scope, so
> they are in scope for the Mozilla Root Policy and by extension the Baseline
> Requirements.
> > > >
> > > > Jonathan
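The scope test Jonathan quotes reduces to a set intersection on KeyPurposeId OIDs; a minimal sketch follows. The OIDs are the standard RFC 5280 values; the treatment of an absent EKU extension reflects the clause elided from the quote above and is my assumption:

```python
# KeyPurposeId OIDs named in Mozilla Root Store Policy section 1.1
ANY_EKU          = "2.5.29.37.0"        # anyExtendedKeyUsage
SERVER_AUTH      = "1.3.6.1.5.5.7.3.1"  # id-kp-serverAuth
EMAIL_PROTECTION = "1.3.6.1.5.5.7.3.4"  # id-kp-emailProtection

IN_SCOPE_EKUS = {ANY_EKU, SERVER_AUTH, EMAIL_PROTECTION}

def in_mozilla_scope(eku_oids):
    """Return True if a cert's EKU set brings it into policy scope.

    `eku_oids` is the set of KeyPurposeId OIDs from the EKU extension,
    or None if the extension is absent (treated here as in scope per
    the clause elided from the quote above -- an assumption).
    """
    if eku_oids is None:
        return True
    return bool(IN_SCOPE_EKUS & set(eku_oids))
```

This is why IdenTrust's six certificates were in scope despite having no FQDN or serverAuth EKU: anyExtendedKeyUsage alone intersects the set.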
> > >
> > > As an update to the reported issue of misclassification of client
> certificates as server certificates, based on our continuing internal
> investigations, feedback from our user community, and also taking into
> account the feedback posted in this forum, we plan to proceed as follows:
> > > 1. No later than August 31, 2017 we will discontinue new issuance or
> reissuance of human certificates with the anyExtendedKeyUsage extension from
> all IdenTrust ACES CAs.
> > > 2. We will allow continued use of the current certificates and replace
> or let them expire through the natural lifecycle because:
> > > a. These certificates are not server certificates
> > > b. All certificates issued are from audited CA(s) with public
> disclosure of audit results
> > > c. The legacy application usage requires the anyExtendedKeyUsage
> extension at the present time, though we are phasing out support of such
> applications.
> > > d. These certificates do not pose a security concern meriting
> immediate revocation
> > > e. Replacement of these certificates would result in significant
> negative impact on our customers.
> >
> > Effective August 28, 2017, IdenTrust has discontinued new issuance or
> reissuance of human certificates with the anyExtendedKeyUsage extension
> from all IdenTrust ACES CAs.
>
>
> IdenTrust continues to work with our customers on revoking/replacing ACES
> SSL certificates with these reported issues:
> - https instead of http for the OCSP validation URL in the AIA extension;
> - Invalid “U.S. Government” as o= in the SDN;
> - Invalid OtherName in the SAN extension.
> For those customers that have not replaced their certificates by September
> 15, 2017, we will revoke them.
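The first of those issues — an https OCSP URI where the Baseline Requirements (section 7.1.2.2(c), as cited earlier in these threads) require plaintext http — is mechanically checkable before issuance. A minimal sketch over extracted AIA entries, not IdenTrust's actual tooling:

```python
OCSP_OID = "1.3.6.1.5.5.7.48.1"  # id-ad-ocsp access method (RFC 5280)

def bad_ocsp_urls(aia_entries):
    """Flag OCSP responder URLs that are not plaintext HTTP.

    `aia_entries` is a list of (access_method_oid, uri) pairs as they
    would be extracted from a certificate's AIA extension; non-OCSP
    access methods (e.g. caIssuers) are ignored.
    """
    return [
        uri
        for oid, uri in aia_entries
        if oid == OCSP_OID and not uri.lower().startswith("http://")
    ]
```

Run as a pre-issuance lint, an empty return list is the pass condition; anything else should block the signing step.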



-- 
konklone.com | @konklone 


Re: Certificates with less than 64 bits of entropy

2017-08-19 Thread Eric Mill via dev-security-policy
On Fri, Aug 18, 2017 at 12:04 PM, Stephen Davidson via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:
>
> 4)  The list of affected certificates is attached in spreadsheet
> form;  they will be uploaded to CT as well.  You will note that the number
> has declined – Siemens' previous report did not take into account that some
> of the certificates had already previously been revoked for other
> reasons.   The spreadsheet also includes certificates issued during the
> Digicert/Verizon root signing period.
>

Would you mind posting this to a public URL or to a Bugzilla bug? The list
doesn't transmit attachments.

-- Eric


>
> Regards, Stephen
>





Re: AC FNMT Usuarios and anyExtendedKeyUsage

2017-08-18 Thread Eric Mill via dev-security-policy
Hi Jose,

Apologies, on looking back through m.d.s.p, it's clear attachments aren't
processed by the list configuration. Would you be able to post it to a URL,
or attach it to a bugzilla bug?

-- Eric

On Fri, Aug 18, 2017 at 10:01 AM, Eric Mill  wrote:

> Hi Jose,
>
> There was no attachment to your email. Would you mind re-sending with an
> attachment?
>
> On Fri, Aug 18, 2017 at 8:17 AM, Jose Manuel Torres via
> dev-security-policy  wrote:
>
>> Hello everyone,
>>
>> In response to the questions raised:
>>
>> AC FNMT Usuarios does not issue TLS/SSL certificates, as evidenced by the
>> attached document: Audit Attestation - ETSI Assessment 2017, FNMT CA's and
>> TSU's.
>>
>> Regarding the anyExtendedKeyUsage EKU, since January 2017 it is no longer
>> incorporated into the certificates issued by AC FNMT Usuarios, so it should
>> not be possible to use it for TLS server authentication.
>>
>> In this sense the certificate indicated in this incident was issued prior
>> to the change indicated.
>>
>> Taking these considerations into account, FNMT considers that a revocation
>> of the intermediate CA by OneCRL is not necessary.
>
>
>





Re: AC FNMT Usuarios and anyExtendedKeyUsage

2017-08-18 Thread Eric Mill via dev-security-policy
Hi Jose,

There was no attachment to your email. Would you mind re-sending with an
attachment?

On Fri, Aug 18, 2017 at 8:17 AM, Jose Manuel Torres via dev-security-policy
 wrote:

> Hello everyone,
>
> In response to the questions raised:
>
> AC FNMT Usuarios does not issue TLS/SSL certificates, as evidenced by the
> attached document: Audit Attestation - ETSI Assessment 2017, FNMT CA's and
> TSU's.
>
> Regarding the anyExtendedKeyUsage EKU, since January 2017 it is no longer
> incorporated into the certificates issued by AC FNMT Usuarios, so it should
> not be possible to use it for TLS server authentication.
>
> In this sense the certificate indicated in this incident was issued prior
> to the change indicated.
>
> Taking these considerations into account, FNMT considers that a revocation
> of the intermediate CA by OneCRL is not necessary.





Re: Certificates issued with HTTPS OCSP responder URL (IdenTrust)

2017-08-15 Thread Eric Mill via dev-security-policy
On Tue, Aug 15, 2017 at 2:47 PM, identrust--- via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> We have been moderately successful in replacing the five (5)
> certificates.  One (1) has been voluntarily replaced, we have a commitment
> from our client to initiate a replacement for one (1) tomorrow and three
> (3) have been revoked by IdenTrust.
>

Thank you for this -- this information is very helpful to the community in
evaluating ongoing impact to clients, and in how specific issues are being
handled beyond the expected 24-hour timespan.

-- Eric







Re: Certificates issued with HTTPS OCSP responder URL (IdenTrust)

2017-08-14 Thread Eric Mill via dev-security-policy
On Fri, Aug 11, 2017 at 4:43 PM, identrust--- via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> On Thursday, August 10, 2017 at 11:51:54 PM UTC-4, Eric Mill wrote:
> > On Thu, Aug 10, 2017 at 11:34 AM, identrust--- via dev-security-policy <
> > dev-security-policy@lists.mozilla.org> wrote:
> > >
> > > We acknowledge seeing this issue and are looking into it.
> > > Details will be supplied as soon as we can, but no later than today’s
> > > end of business day.
> > >
> >
> > Thanks for looking into it. It's coming up on the end of the day - do you
> > have an update?
> >
> > -- Eric
> >
> >
> >
> >
> >
>
> IdenTrust is fully aware of the situation and has consulted with internal
> and external parties to ensure that our course of action is appropriate and
> commensurate with our business practices and accommodates our customer’s
> needs.
> When IdenTrust initially established the ACES SSL profile, it was intended
> to apply only to US government entities.  At that time, the Organization
> was defined as a static value of “U.S. Government” in our profiles.  Later,
> when non-agencies applied, IdenTrust reasoned at that time that this static
> value continued to be acceptable as these entities must identify themselves
> as organizations that act as relying parties, accepting certificates issued
> under the ACES program, and are in some capacity associated with the U.S.
> Government.  We have discussed internally and taken a fresh look at this
> decision.   As a result, IdenTrust has updated the ACES SSL profile to use
> the applicant Organization name in the Subject DN organization field.  This
> change will accommodate all applications for ACES SSL certificates, both
> U.S. agencies and non-agencies.  At the same time, we have modified the
> OCSP validation URL from HTTPS to HTTP.
> It is important to note that all certificates that are impacted by this
> situation have been appropriately vetted by the IdenTrust Registration team
> according to Identity Validation requirements stated in the ACES CP,
> therefore the need to revoke affected certificates immediately is less
> critical.  Our key objective is to revoke all incorrect certificates as
> quickly as possible, while minimizing the impact to our customers and
> avoiding disruption to critical business processes.  As such, IdenTrust is
> working directly with these customers to initiate a replacement for the
> offending certificates.  The replacement process allows the client to use
> an online mechanism to request a new certificate with the correct
> attributes and immediately download the new certificate.  The replacement
> process also automatically revokes the certificate being replaced.  This
> will enable our clients to receive a newly vetted certificate and they will
> not be inconvenienced by a forced revocation, which would likely adversely
> impact their business processes. IdenTrust will ultimately force a
> revocation, in the event that the clients do not initiate a certificate
> replacement in response to our communications.
>

Thanks for the background and the detail. Given that you're intentionally
ignoring the 24-hour revocation rule, could you at least provide an
estimate of when Identrust will force revocations to be done by? Have
clients initiated prompt replacements?

-- Eric


>
> Thank you for the opportunity to represent our position.





Re: Certificate issued by D-TRUST SSL Class 3 CA 1 2009 with short SerialNumber

2017-08-14 Thread Eric Mill via dev-security-policy
Hi Arno, Martin,

On Mon, Aug 14, 2017 at 11:37 AM, Arno Fiedler via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> As a result we confirm that we will take the following steps and report on
> the implementation no later than 15-09-2017:
> •   Contact all affected customers, inform them and get the certs
> replaced (includes revocation)


Can you be a bit more detailed about this step? By what date will all
affected certs be revoked?

-- Eric



>





Re: Certificates with less than 64 bits of entropy

2017-08-12 Thread Eric Mill via dev-security-policy
If they're not going to revoke within 24 hours and willingly violate that
part of the policy, I would at least expect them to, within that 24 hours,
produce a description of why this happened, what they're doing to fix it,
and when they expect the certificates to be replaced (along with an
expectation of when a hard revocation deadline would be regardless of
customer responsiveness). Once the underlying issue is fixed, I would
expect them to ring in to say that it's fixed and what they did to fix it.

That's just basic good-faith engagement that demonstrates that the issuing
CA at least takes the issue as seriously as the community does, and
engenders trust that the issue is being addressed.

Let's Encrypt just responded this week to an encoding compliance failure
with a live production code fix (including code review and sign off) within
6 hours of being notified.

While not every issuing CA may take security seriously enough to employ
engineers on staff who can research, author and deploy a production code
fix in a 24 hour period, every issuing CA should be able to muster the
strength to keep the community informed of their plans and progress in
however long it takes to address the issue.
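As context for the thread title: the Baseline Requirements call for at least 64 bits of CSPRNG output in certificate serial numbers, and the tell is usually a whole pool of suspiciously narrow serials rather than any single one. A minimal sketch of that heuristic:

```python
def max_serial_bits(serials):
    """Upper bound on serial entropy: the widest serial seen, in bits."""
    return max(s.bit_length() for s in serials)

def pool_looks_low_entropy(serials, required_bits=64):
    """Heuristic: a CA whose serials never reach `required_bits` of width
    cannot be drawing them from >= 64 bits of CSPRNG output.

    Any individual short serial is unremarkable -- a random 64-bit value
    has a leading zero bit half the time -- so only judge the pool.
    """
    return max_serial_bits(serials) < required_bits
```

A scan like this over a CA's CT-logged corpus is roughly how these misissuances were surfaced in the first place; it needs no cooperation from the CA at all.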

-- Eric

On Fri, Aug 11, 2017 at 10:33 AM, Ben Wilson via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> Apparently they haven’t yet, but we’ll assume that they will.
>
> Does the community expect a remediation plan for their code and then a
> revocation-and-replacement plan?
>
>
>
> Ben Wilson, JD, CISA, CISSP
>
> VP Compliance
>
> +1 801 701 9678
>
>
>
>
>
> From: Alex Gaynor [mailto:agay...@mozilla.com]
> Sent: Friday, August 11, 2017 8:31 AM
> To: Ben Wilson 
> Cc: Jeremy Rowley ; Jonathan Rudenberg <
> jonat...@titanous.com>; mozilla-dev-security-pol...@lists.mozilla.org
> Subject: Re: Certificates with less than 64 bits of entropy
>
>
>
> Have they fixed whatever issue there is with their PKI infrastructure that
> leads to this issue? From skimming, I see this pool contains certs issued
> as recently as one month ago.
>
>
>
> Alex
>
>
>
> On Fri, Aug 11, 2017 at 10:26 AM, Ben Wilson via dev-security-policy <
> dev-security-policy@lists.mozilla.org> wrote:
>
> With regard to Siemens, given the large number of certificates and the
> disruption that massive revocations will have on their infrastructure, what
> does this community expect them to do?
>
>
> -Original Message-
> From: dev-security-policy [mailto:dev-security-policy-bounces+ben=digicert@lists.mozilla.org]
> On Behalf Of Jeremy Rowley via dev-security-policy
> Sent: Thursday, August 10, 2017 12:01 PM
> To: Jonathan Rudenberg; mozilla-dev-security-pol...@lists.mozilla.org
>
> Subject: RE: Certificates with less than 64 bits of entropy
>
> Hi Jonathan,
>
> InfoCert's sub CA was revoked on August 1, 2017. We'll reach out to
> Siemens. They moved to Quovadis a while ago and are no longer issuing from
> that Sub CA.
>
> Jeremy
>
> -Original Message-
> From: dev-security-policy [mailto:dev-security-policy-bounces+jeremy.rowley=digicert@lists.mozilla.org]
> On Behalf Of Jonathan Rudenberg via dev-security-policy
> Sent: Thursday, August 10, 2017 9:26 AM
> To: mozilla-dev-security-pol...@lists.mozilla.org
> Subject: Re: Certificates with less than 64 bits of entropy
>
>
> > On Aug 10, 2017, at 11:20, Jonathan Rudenberg via dev-security-policy <
> dev-security-policy@lists.mozilla.org> wrote:
> >
> > QuoVadis (560)
> >Siemens Issuing CA Internet Server 2016 (560)
> >
> > D-TRUST (224)
> >D-TRUST SSL Class 3 CA 1 2009 (178)
> >D-TRUST SSL Class 3 CA 1 EV 2009 (45)
> >D-TRUST Root Class 3 CA 2 EV 2009 (1)
> >
> > DigiCert (85)
> >Siemens Issuing CA Class Internet Server 2013 (82)
> >InfoCert Web Certification Authority (3)
> >
> > Izenpe S.A. (62)
> >EAEko Herri Administrazioen CA - CA AAPP Vascas (2) (62)
> >
> > Government of The Netherlands, PKIoverheid (Logius) (55)
> >Digidentity Services CA - G2 (55)
> >
> > Government of Turkey, Kamu Sertifikasyon Merkezi (Kamu SM) (38)
> >Cihaz Sertifikası Hizmet Sağlayıcı - Sürüm 4 (38)
>
> It looks like my summary missed one QuoVadis intermediate:
>
> Bayerische SSL-CA-2016-01 (3)
>

Re: 2017.08.10 Let's Encrypt Unicode Normalization Compliance Incident

2017-08-12 Thread Eric Mill via dev-security-policy
On Fri, Aug 11, 2017 at 5:20 PM, Matthew Hardeman via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> If one integrates a project like certlint/cablint into the cert issuance
> pipeline, one suddenly takes on supplemental responsibility for certlint's
> bugs or changes.
>

That's the case for any source code Let's Encrypt uses that was written by
someone else. Like all software, there are third party dependencies in
there somewhere, whether closed source or open source. (In Let's Encrypt's
case, they are generally open source, which helps the team's ability to
review it.)


> The pace of change in certlint, just glancing at the git commits, is not
> slow.  New tests are added.  Tests are revised.
>

That's a good thing.

Even still... anywhere along the way, Mr. Bowen could go totally evil (I
> seriously doubt this would happen) and decide that certlint should flag "E:
> This CA is run by nasty people" anytime Issuer CN contains 'Maligned CA X1'.
>

You seem to be assuming that Let's Encrypt would just automatically pull
down new code into its critical issuance code path without review. I would
definitely not assume that. That code will be reviewed before deployment.


> It seems reasonable to me that an implementing CA might want to add some
> buffer between the initial commit/merge and their opportunity to perform
> some manual review of the changes prior to incorporating into their
> issuance environment.
>

Yes, of course.

This is a lot of time to spend discussing the basics of project dependency
management. There are definitely tradeoffs for Let's Encrypt to evaluate
when considering something like integrating certlint into the issuance
pipeline -- performance of the certlint tool, potential memory leaks, as
well as the operations necessary to support a hosted service that keeps
certlint in memory for rapid processing.

If certlint proves to be too slow or take too much memory, then an
integration push could either cause those issues to be fixed, or cause a
new tool to be written that performs the same checks certlint does (now
that the work has been done to map and isolate the BRs into specific
technical checks).

We should be understanding if engineering tradeoffs preclude immediate
integration, but we should not dismiss the idea of relying on "someone
else's code" in the issuance pipeline. I'm sure that's already the case for
every CA in operation today.
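A concrete shape for the "buffer between upstream and issuance" discussed above: pin a reviewed linter build, run it against the to-be-signed certificate, and block on error-level findings. The CLI name, arguments, and `E:`/`F:` output prefixes below are assumptions about certlint's interface, not a documented contract:

```python
import subprocess

# certlint-style severity prefixes treated as issuance-blocking; the exact
# prefixes and CLI invocation are assumptions about the tool's interface.
BLOCKING_PREFIXES = ("E:", "F:")

def should_block(findings):
    """Decide from linter output lines whether to block issuance."""
    return any(line.startswith(BLOCKING_PREFIXES) for line in findings)

def lint_precert(der_path, lint_cmd=("certlint",), timeout=5.0):
    """Run a pinned, code-reviewed linter build against a DER file.

    Pinning to a reviewed commit (rather than tracking upstream HEAD)
    is precisely the buffer between upstream changes and the issuance
    path that the thread describes.
    """
    proc = subprocess.run(
        [*lint_cmd, der_path], capture_output=True, text=True, timeout=timeout
    )
    return proc.stdout.splitlines()
```

The decision logic is deliberately separate from the subprocess call, so the policy (which severities block, whether a linter crash fails open or closed) can be reviewed and tested independently of the tool itself.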

-- Eric







Re: Certificates issued with HTTPS OCSP responder URL (IdenTrust)

2017-08-10 Thread Eric Mill via dev-security-policy
On Thu, Aug 10, 2017 at 11:34 AM, identrust--- via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:
>
> We acknowledge seeing this issue and are looking into it.
> Details will be supplied as soon as we can, but no later than today’s end
> of business day.
>

Thanks for looking into it. It's coming up on the end of the day - do you
have an update?

-- Eric







Re: Certificates issued with HTTPS OCSP responder URL (IdenTrust)

2017-08-09 Thread Eric Mill via dev-security-policy
On Wed, Aug 9, 2017 at 4:28 PM, Lee <ler...@gmail.com> wrote:

> On 8/9/17, Eric Mill via dev-security-policy
> <dev-security-policy@lists.mozilla.org> wrote:
> > On Tue, Aug 8, 2017 at 5:53 PM, identrust--- via dev-security-policy <
> > dev-security-policy@lists.mozilla.org> wrote:
> >
> >> On Tuesday, August 8, 2017 at 12:06:47 PM UTC-4, Jonathan Rudenberg
> wrote:
> >> > > On Aug 8, 2017, at 10:29, identrust--- via dev-security-policy <
> >> dev-security-policy@lists.mozilla.org> wrote:
> >> > >
> >> > > On Monday, August 7, 2017 at 4:47:39 PM UTC-4, Jonathan Rudenberg
> >> wrote:
> >> > >> “IdenTrust ACES CA 2” has issued five certificates with an OCSP
> >> responder URL that has a HTTPS URI scheme. This is not valid, the OCSP
> >> responder URI is required to have the plaintext HTTP scheme according to
> >> Baseline Requirements section 7.1.2.2(c).
> >> > >>
> >> > >> Here’s the list of certificates: https://misissued.com/batch/4/
> >> > >>
> >> > >> Jonathan
> >> > >
> >> > > IdenTrust had previously interpreted HTTP to be inclusive of HTTPS
> >> > > in this context.  That being said, we have altered our profiles for
> >> certificates
> >> > > issued under this Sub CA to include only HTTP OCSP URLs.  All
> >> certificates
> >> > > issued going forward will contain an HTTP OCSP URL.  We will also
> >> examine all
> >> > > other sub CA to ensure only HTTP OCSP URLs are included.  Thank you
> >> for giving
> >> > > us an opportunity to address this with the community
> >> >
> >> > Thanks for the update.
> >> >
> >> > Can you also clarify why the subject organizationName is "U.S.
> >> Government” for all of these certificates, despite the other subject
> >> fields
> >> indicating organizations that are not a component of the US Government?
> >> >
> >> > Jonathan
> >>
> >> Yes,
> >> IdenTrust ACES SSL Certificates are issued in accordance with the ACES
> >> certificate policy defined by U.S. General Service Administration (
>> http://csrc.nist.gov/groups/ST/crypto_apps_infra/csor/documents/ACES-CP-v3-2_signed_05122017.pdf)
>> and the GSA approved IdenTrust CPS
>> (https://secure.identrust.com/certificates/policy/aces/IdenTrust_ACES_CPS_v5.1_20161110.pdf)
> >> These ACES SSL certificates are issued to either U.S. Government
> agencies
> >> and/or their sub-contractors in support of government programs\projects.
> >> The
> >> CP requires an approved CA, such as IdenTrust, to identify U.S.
> Government
> >> in
> >> subject organizationName along with other applicable organizations (e.g.
> >> sub-contractors, or local government agency, etc...).
> >>
> >
> > If that's the case, I would expect each certificate to be authenticating
> > hostnames that are used solely to provide such services to the U.S.
> > Government. That doesn't appear to be the case with these.
> >
> > For example, one of them is for the homepage for a service provider:
> > www.mudiaminc.com
>
> What am I doing wrong?  goto https://www.mudiaminc.com/
> check the cert and it says
> Issued To
> Common Name (CN): *.opentransfer.com
> Organization (O): ECOMMERCE, INC.
>

You're not doing anything wrong, that hostname is just not using that
certificate at this time, at least not to public users. But issuance is
what matters here.

Given the capitalization of the common name, and the
organizationalUnitName, the certificate was clearly issued to the same
company.


> And one of them is for what appears to be a state government revenue
> > service's VPN: vpn.revenue.louisiana.gov
>
> I see that one - goto https://vpn.revenue.louisiana.gov/
> check the cert and it says
> Issued To
> Common Name (CN): Vpn.revenue.louisiana.gov
> Organization (O): U.S. Government
>
> > (So it's clear, "U.S. Government" only refers to the federal government,
> > not state/local/tribal governments.)
> >
> > I personally (and to be clear, this is in my individual capacity and I am
> > not representing my employer) think these are invalid organizationNames,
> > constitute misissuance, and that Identrust should be using the "U.S.
> > Government" only for hostnames providing services operated exclusively on
> > behalf of the federal government.
>

Re: Certificates issued with HTTPS OCSP responder URL (IdenTrust)

2017-08-09 Thread Eric Mill via dev-security-policy
On Tue, Aug 8, 2017 at 5:53 PM, identrust--- via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> On Tuesday, August 8, 2017 at 12:06:47 PM UTC-4, Jonathan Rudenberg wrote:
> > > On Aug 8, 2017, at 10:29, identrust--- via dev-security-policy <
> dev-security-policy@lists.mozilla.org> wrote:
> > >
> > > On Monday, August 7, 2017 at 4:47:39 PM UTC-4, Jonathan Rudenberg
> wrote:
> > >> “IdenTrust ACES CA 2” has issued five certificates with an OCSP
> responder URL that has a HTTPS URI scheme. This is not valid, the OCSP
> responder URI is required to have the plaintext HTTP scheme according to
> Baseline Requirements section 7.1.2.2(c).
> > >>
> > >> Here’s the list of certificates: https://misissued.com/batch/4/
> > >>
> > >> Jonathan
> > >
> > > IdenTrust had previously interpreted HTTP to be inclusive of HTTPS in
> > > this context.  That being said, we have altered our profiles for
> certificates
> > > issued under this Sub CA to include only HTTP OCSP URLs.  All
> certificates
> > > issued going forward will contain an HTTP OCSP URL.  We will also
> examine all
> > > other sub CA to ensure only HTTP OCSP URLs are included.  Thank you
> for giving
> > > us an opportunity to address this with the community
> >
> > Thanks for the update.
> >
> > Can you also clarify why the subject organizationName is "U.S.
> Government” for all of these certificates, despite the other subject fields
> indicating organizations that are not a component of the US Government?
> >
> > Jonathan
>
> Yes,
> IdenTrust ACES SSL Certificates are issued in accordance with the ACES
> certificate policy defined by U.S. General Service Administration (
> http://csrc.nist.gov/groups/ST/crypto_apps_infra/csor/documents/ACES-CP-v3-2_signed_05122017.pdf)
> and the GSA approved IdenTrust CPS
> (https://secure.identrust.com/certificates/policy/aces/IdenTrust_ACES_CPS_v5.1_20161110.pdf)
> These ACES SSL certificates are issued to either U.S. Government agencies
> and/or their sub-contractors in support of government programs\projects.
> The
> CP requires an approved CA, such as IdenTrust, to identify U.S. Government
> in
> subject organizationName along with other applicable organizations (e.g.
> sub-contractors, or local government agency, etc...).
>

If that's the case, I would expect each certificate to be authenticating
hostnames that are used solely to provide such services to the U.S.
Government. That doesn't appear to be the case with these.

For example, one of them is for the homepage for a service provider:
www.mudiaminc.com

And one of them is for what appears to be a state government revenue
service's VPN: vpn.revenue.louisiana.gov

(So it's clear, "U.S. Government" only refers to the federal government,
not state/local/tribal governments.)

I personally (and to be clear, this is in my individual capacity and I am
not representing my employer) think these are invalid organizationNames,
constitute misissuance, and that Identrust should be using the "U.S.
Government" only for hostnames providing services operated exclusively on
behalf of the federal government.
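The distinction drawn here — federal government only, not state/local/tribal — is why a naive suffix check cannot validate O="U.S. Government": vpn.revenue.louisiana.gov sits under .gov yet serves a state agency. A sketch against a curated federal-domain list (the list contents are a hypothetical input, not a real registry):

```python
def misattributed_usg(org, sans, federal_domains):
    """Flag certs claiming O="U.S. Government" for hostnames that are not
    all on a curated list of federal-agency domains.

    `federal_domains` must be curated -- a bare ".gov" suffix check would
    wrongly accept state domains like revenue.louisiana.gov, the exact
    case discussed above.
    """
    def is_federal(host):
        h = host.lower()
        return any(h == d or h.endswith("." + d) for d in federal_domains)

    return org == "U.S. Government" and not all(is_federal(h) for h in sans)
```

The heuristic is only as good as its input list, but it captures the substance of the objection: the O field's claim should be checkable against what the certified hostnames actually serve.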

-- Eric








Re: Final Decision by Google on Symantec

2017-07-31 Thread Eric Mill via dev-security-policy
Given that we're past the 7/31 deadline and the comments in support of
following Chrome's lead, it sounds likely that that's what's happening. And
I think that's an understandable conclusion for Mozilla to draw, given the
compatibility risk Mozilla would be leading on for at least several months.

However, I think Mozilla should consider the larger precedent being set by
Mozilla deferring to such a significant relaxation in enforcement by Chrome
in such a complete way.

It's quite possible that Chrome's original proposed timetable was too
aggressive, but their final proposed timetable is quite weak, and it seems
like participants here generally agree that a partial distrust date in
December, preceding the holiday season, would be a reasonable conclusion.

I find it particularly disheartening that Symantec has been able to
successfully deploy hardball tactics to obtain more favorable treatment
from Google, and now likely Mozilla. As has been discussed amply on this
list, Symantec engaged the bare minimum necessary with the Mozilla
community, repeatedly missed or just made deadlines, and refused to answer
follow-up questions from community participants.

On at least one occasion, Symantec publicly pressured Mozilla to halt
public discussion about independent enforcement in favor of waiting for
Google's decision, with what appeared to be barely contained glee at having
managed to get Google executives involved to slow down the process and
obtain a weaker proposal.

I also want to point out Symantec's customer communication from around
July 11th, as shared on blink-dev:
https://groups.google.com/a/chromium.org/d/msg/blink-dev/eUAKwjihhBs/smcHvd2HAgAJ

It instructs their customers to replace all of their certificates issued
before June 2016 by August 8th:


One aspect of Google’s proposal is that starting August 8, 2017, Chrome
would gradually begin mistrusting all Symantec branded certificates issued
before June 1, 2016.

We urge you take prompt action in order to avoid the risk of having your
certificates mistrusted by Google’s Chrome browser. At the end of this
email is an instruction to identify your certificates that are at risk, and
the date which Google has stated they may begin mistrusting them.

We recommend that you replace these certificates prior to August 8, 2017 to
minimize any disruption.


Symantec is referencing dates from a previous Chrome proposal by Ryan
Sleevi:
https://groups.google.com/a/chromium.org/d/msg/blink-dev/eUAKwjihhBs/ovLalSBRBQAJ

But Chrome's proposal only references August 8th as the date by which
Symantec should be issuing all certificates from their managed PKI. The
proposal said that existing certs issued before June 2015 would be
distrusted on August 31st, and existing certs issued before June 2016 would
be distrusted in January 2018.

The net effect of Symantec's customer communication is that Symantec sent
its customers into a low-grade panic by waiting for almost 2 months from
the May proposal date to send them an email that, for most customers,
certainly appears to suggest that in 3 weeks, all their pre-June-2016 certs
will start causing errors.

The Symantec email references a list of specific dates per certificate,
which presumably match Chrome's specific proposal, but I can tell you that
I have observed Symantec customers interpret this communication as an
impending August 8th distrust date for existing Symantec certificates in
Chrome.

I find it quite plausible that Symantec deliberately encouraged unnecessary
anxiety among their customer base by delaying this notice and overstating
the severity of the distrust event, to validate their arguments about risk
to internet service availability and to strengthen their negotiating
position with Google.

But even if their intent was not quite so bad-faith, Symantec's handling of
this process was at the very least highly disorganized and belligerent, to
the point that I think Mozilla would be within their rights to lose
confidence in Symantec's future participation in the Mozilla root program.

So whatever Mozilla chooses to do, I hope that it reflects Mozilla's
independent assessment of the risk posed to their users by Symantec's
current certificate corpus and their expected participation in the program,
and that it reinforces Mozilla as an independent party in future
negotiations with other members of their root program.

-- Eric

On Fri, Jul 28, 2017 at 2:14 AM, Gervase Markham via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> Google have made a final decision on the various dates they plan to
> implement as part of the consensus plan in the Symantec matter. The
> message from blink-dev is included below.
>
> Most of the dates have consensus - the dates for Symantec to implement
> the Managed CA infrastructure are agreed by all, and the date for final
> distrust of the old Symantec PKI is agreed by Google and Mozilla (to
> within a week, at any rate). I proposed November 1st 2018. Google has
> gone for October 23rd 2018; 

Re: [EXT] Symantec Update on SubCA Proposal

2017-07-19 Thread Eric Mill via dev-security-policy
On Wed, Jul 19, 2017 at 11:31 AM, Steve Medin via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> > -----Original Message-----
> > From: dev-security-policy
> > [mailto:dev-security-policy-bounces+steve_medin=symantec.com@lists.mozilla.org]
> > On Behalf Of Jakob Bohm via dev-security-policy
> > Sent: Tuesday, July 18, 2017 4:39 PM
> > To: mozilla-dev-security-pol...@lists.mozilla.org
> > Subject: Re: [EXT] Symantec Update on SubCA Proposal
> >
> >
> > Just for clarity:
> >
> > (Note: Using ISO date format instead of ambiguous local date format)
> >
> > How many Symantec certs issued prior to 2015-06-01 expire after
> > 2018-06-01, and how does that mesh with the alternative date proposed
> > below:
> >
> > On 18/07/2017 21:37, Steve Medin wrote:
> > > Correction: Summary item #3 should read:
> > >
> > > 3. May 1, 2018
> > >    a. Single date of distrust of certificates issued prior to 6/1/2016
> > > (changed from August 31, 2017 for certificates issued prior to 6/1/2015
> > > and from January 18, 2018 for certificates issued prior to 6/1/2016).
> > >
>
> Over 34,000 certificates were issued prior to 2015-06-01 and expire after
> 2018-06-01. This is in addition to almost 200,000 certificates that would
> also need to be replaced under the current SubCA proposal assuming a May 1,
> 2018 distrust date. We believe that nine months (from August 1, 2017 to May
> 1, 2018) is aggressive but achievable for this transition — a period
> minimally necessary to allow for site operators to plan and execute an
> orderly transition and to reduce the potential risk of widespread ecosystem
> disruption. Nevertheless, we urge the community to consider moving the
> proposed May 1, 2018 distrust date out even further to February 1, 2019 in
> order to minimize the risk of end user disruption by ensuring that website
> operators have a reasonable timeframe to plan and deploy replacement
> certificates.
>

That's pretty close to saying that nothing should happen, since almost all
the certificates will have expired by then. That certainly is the least
disruptive, but it seems contrary to the intent of the proposal.

-- Eric


> ___
> dev-security-policy mailing list
> dev-security-policy@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-security-policy
>



-- 
konklone.com | @konklone 
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: WoSign new system passed Cure 53 system security audit

2017-07-09 Thread Eric Mill via dev-security-policy
So who acts as the CEO for WoSign when final executive decisions need to be
made?


On Sun, Jul 9, 2017 at 9:41 PM, Richard Wang via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> Mr. Wang is the COO now, according to Mr. Tan's public announcement at the
> March CAB Forum meeting.
>
> CEO is still N/A; if anyone is interested in the CEO position, please
> send your resume to Mr. Tan.
>
>
> Best Regards,
>
> Richard
>
> -----Original Message-----
> From: dev-security-policy
> [mailto:dev-security-policy-bounces+richard=wosign.com@lists.mozilla.org]
> On Behalf Of Itzhak Daniel via dev-security-policy
> Sent: Monday, July 10, 2017 4:57 AM
> To: mozilla-dev-security-pol...@lists.mozilla.org
> Subject: Re: WoSign new system passed Cure 53 system security audit
>
> Mr. Wang is mentioned at the end of the document; what is Mr. Wang's
> current official responsibility at WoSign?
>
> According to the incident report, released in October 2016 [1], Mr. Wang
> was supposed to be relieved of his duties as CEO; this is mentioned in 3
> separate paragraphs (pp. 17, 25, 26).
>
> Links:
> 1. https://www.wosign.com/report/WoSign_Incident_Report_Update_07102016.pdf
>
> ___
> dev-security-policy mailing list
> dev-security-policy@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-security-policy
> ___
> dev-security-policy mailing list
> dev-security-policy@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-security-policy
>



-- 
konklone.com | @konklone 
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: StartCom issuing bogus certificates

2017-05-31 Thread Eric Mill via dev-security-policy
The content on example.com isn't important. An unauthorized certificate can
still potentially be used to intercept an HTTPS connection to example.com
and cause malicious behavior that is unrelated to the "real" content of
example.com.

I'm pushing on this because it's important to understand that a misissuance
like this is bad for more than just "compliance" reasons. It's the same
principle behind moving away from unencrypted connections generally -- even
"unimportant" sites benefit from the use of HTTPS, in part because it
closes off attack vectors that are present for all connections that fail to
treat the network as untrusted.

-- Eric

On Wed, May 31, 2017 at 8:48 PM, Yuhong Bao wrote:

> I don't think there is anything important on example.com though
>
> From: Eric Mill
> Sent: Wednesday, May 31, 2017 4:34:20 PM
> To: Jeremy Rowley
> Cc: Kurt Roeckx; Yuhong Bao; mozilla-dev-security-pol...@lists.mozilla.org;
> Matthew Hardeman
> Subject: Re: StartCom issuing bogus certificates
>
> It's absolutely not harmless to use example.com to test certificate
> issuance. People visit example.com all the time, given its role. An
> unauthorized certificate for example.com could let someone other than its
> owner hijack user connections, and maliciously redirect traffic or inject
> code/content, same as for any other online service people use. It's an
> actual security problem, not just a compliance violation.
>
> -- Eric
>
> On Wed, May 31, 2017 at 3:18 PM, Jeremy Rowley via dev-security-policy <
> dev-security-policy@lists.mozilla.org> wrote:
> Agreed - the license to use the domain granted by IANA is only for
> inclusion
> in documents (https://www.iana.org/domains/reserved). There isn't a
> license
> to use the domain for testing or any other purposes.
>
> -----Original Message-----
> From: dev-security-policy
> [mailto:dev-security-policy-bounces+jeremy.rowley=digicert.com@lists.mozilla.org]
> On Behalf Of Kurt Roeckx via dev-security-policy
> Sent: Wednesday, May 31, 2017 11:55 AM
> To: Yuhong Bao
> Cc: mozilla-dev-security-pol...@lists.mozilla.org; Matthew Hardeman
> Subject: Re: StartCom issuing bogus certificates
>
> On Wed, May 31, 2017 at 05:09:57PM +, Yuhong Bao via
> dev-security-policy wrote:
> > The point is that "misissuance" of example.com is harmless as they are
> > reserved by IANA.
>
> But example.com is a real domain that even has an https website. The
> certificate is issued by digicert, and the subject says it's to ICANN. If
> the certificate is not requested by IANA or ICANN, nobody should issue a
> certificate for it.
>
>
> Kurt
>
> ___
> dev-security-policy mailing list
> dev-security-policy@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-security-policy
>
> ___
> dev-security-policy mailing list
> dev-security-policy@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-security-policy
>
>
>
>
> --
> konklone.com | @konklone
>



-- 
konklone.com | @konklone 
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: StartCom issuing bogus certificates

2017-05-31 Thread Eric Mill via dev-security-policy
It's absolutely not harmless to use example.com to test certificate
issuance. People visit example.com all the time, given its role. An
unauthorized certificate for example.com could let someone other than its
owner hijack user connections, and maliciously redirect traffic or inject
code/content, same as for any other online service people use. It's an
actual security problem, not just a compliance violation.

-- Eric

On Wed, May 31, 2017 at 3:18 PM, Jeremy Rowley via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> Agreed - the license to use the domain granted by IANA is only for
> inclusion
> in documents (https://www.iana.org/domains/reserved). There isn't a
> license
> to use the domain for testing or any other purposes.
>
> -----Original Message-----
> From: dev-security-policy
> [mailto:dev-security-policy-bounces+jeremy.rowley=digicert.com@lists.mozilla.org]
> On Behalf Of Kurt Roeckx via dev-security-policy
> Sent: Wednesday, May 31, 2017 11:55 AM
> To: Yuhong Bao 
> Cc: mozilla-dev-security-pol...@lists.mozilla.org; Matthew Hardeman
> 
> Subject: Re: StartCom issuing bogus certificates
>
> On Wed, May 31, 2017 at 05:09:57PM +, Yuhong Bao via
> dev-security-policy wrote:
> > The point is that "misissuance" of example.com is harmless as they are
> > reserved by IANA.
>
> But example.com is a real domain that even has an https website. The
> certificate is issued by digicert, and the subject says it's to ICANN. If
> the certificate is not requested by IANA or ICANN nobody should issue a
> certificate for it.
>
>
> Kurt
>
> ___
> dev-security-policy mailing list
> dev-security-policy@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-security-policy
>
> ___
> dev-security-policy mailing list
> dev-security-policy@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-security-policy
>
>


-- 
konklone.com | @konklone 
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: Symantec: Draft Proposal

2017-05-07 Thread Eric Mill via dev-security-policy
On Sun, May 7, 2017 at 6:09 PM, Rick Andrews via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> I'm posting this on behalf of Symantec:
>
> We would like to update the community about our ongoing dialogue with
> Google.
>
> Following our May 4th post, senior executives at Google and Symantec
> established a new dialogue with the intention to arrive at a new proposal
> for the community that addresses the substantial customer impact that would
> result from prior proposals. We urge Symantec customers and the browser
> community to pause on decisions related to this matter until final
> proposals are posted and accepted.


This call for the browser community to not make any decisions until Google
and Symantec finalize and accept a proposal completely marginalizes and
ignores both Mozilla and the broader web community.

The "new dialogue" part also comes across as having gone over Ryan's head.
This is unfortunately consistent with Symantec's latest blog post, which
unprofessionally referred to proposals by "Mr. Sleevi" and "Mr. Markham".
These statements personalize the issue and marginalize the proposals by
casting them as individual opinions and not the views of their
organization. They also reinforce the perception that Symantec sees their
situation as the product of an unreasonable person or two and not the
result of their own errors.

This list just spent the last two weeks focused on a large host of issues,
curated by Mozilla on their wiki and discussed by the broader community
here. So far, all Symantec has done to publicly respond to those is to send
a single email per-issue, and then not otherwise participate in the
discussion beyond blog posts.

Posting a call to Mozilla's community list asking for Mozilla and its
community to pause while Symantec gets on the phone with senior Google
executives to work it all out is a baffling tactic. I hope Mozilla
continues to assert its stake in this process.

-- Eric

> The intent of both Google and Symantec is to arrive at a proposal that
> improves security while minimizing business disruption across the community.
>
> We want to reassure the community that we are taking these matters and the
> impact on the community very seriously.
> ___
> dev-security-policy mailing list
> dev-security-policy@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-security-policy
>
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: Symantec: Draft Proposal

2017-05-06 Thread Eric Mill via dev-security-policy
On Thu, May 4, 2017 at 11:30 PM, Steve Medin via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> Gerv, thank you for your draft proposal under consideration. We have posted
> our comments and detailed information at:
> https://www.symantec.com/connect/blogs/symantec-ca-continues-public-dialogue


(Posting in my personal capacity.)

Symantec says that Google's and Mozilla's proposals to impose a shorter
certificate lifetime will harm their CA business and cause customers to
move to other CAs.

The last time that Symantec was targeted for selective technical
enforcement was when Google imposed a CT requirement on Symantec-issued
certificates. Symantec had already set up a CT log and advocated for an
ecosystem-wide CT requirement before then, and responded to Google's
requirement by continuing this advocacy.

But in this case, Symantec is rejecting the premise and stating that to
impose a 13-month limit industry-wide would require automation and not be
feasible for enterprises, and lead to increased operating costs:

We also do not believe that a 13-month validity limit should be imposed on
the CA industry *at this time* – a conclusion that is reinforced by the
recent CA/Browser Forum vote rejecting ballot 185, which proposed to limit
the maximum validity of SSL/TLS certificates issued by all CAs to 13
months. As we have stated in our public response, many enterprises are not
at the level of automation maturity necessary to practically and
cost-effectively adopt shorter validity certificates. For these
organizations, standardizing on shorter validity certificates would present
substantial increases in their operating costs.


I believe that Symantec's assessment of this issue, expressed in this post
and in their public voting statement on Ballot 185 [1], is seriously
mistaken.

While it's certainly true that enterprises would experience some pain and
cost, Symantec states that 13-month certificates would either require
automation to use, or would create such a workload increase that IT shops
would have to hire staff. This is unpersuasive, as Mozilla and Google and
others (myself included) have tried to communicate throughout the various
discussions on this issue since January.

Everyone has recognized that a decrease to 90-day certificates would likely
create such a situation. However, as someone who has worked in very large
enterprises myself, I do not believe that moving to an annual renewal
schedule is infeasible for the enterprise community to handle.

Yes, it will cost them something, but the organizations that feel the pain
most acutely will logically be the largest ones -- and the largest
enterprises will also have the resources to respond appropriately.

As importantly, Symantec should be embracing changes that move enterprise
customers along the path towards automation. My experience is that the lack
of progress on automation is one of the most toxic and self-destructive
features of the enterprise IT sector. At scale, a reliance on error-prone
and unscalable human processes for basic infrastructure maintenance is a
massive contributor to defense being so much more expensive than offense
today.
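To ground what "automation" means here: once renewal is owned by a scheduled
job rather than a person, certificate lifetime largely stops mattering to
operators. A hypothetical sketch only; it assumes the certbot ACME client and
an nginx reload hook, neither of which is specified anywhere in this thread:

```shell
# Hypothetical crontab entry: attempt renewal every Monday at 03:00.
# "certbot renew" only replaces certificates that are close to expiry,
# then runs the deploy hook so the server picks up the new certificate.
0 3 * * 1  certbot renew --quiet --deploy-hook "systemctl reload nginx"
```

Any ACME client wired to any scheduler works the same way; the point is that
the renewal cadence becomes a configuration detail rather than a staffing cost.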

Symantec's current proposal and blog post indicate that they are working to
create automation-friendly options for customers, but that's not nearly
sufficient to motivate the industry to change their behavior.

I believe that if Symantec changes their attitude and puts their full
weight behind shorter-lived certificates, it would indicate:

* A recognition that technical controls are superior to policy controls,
especially when a CA is of such a significant size that reliable policy
control enforcement becomes expensive.
* An understanding that Symantec's enterprise customers will always push
back on changes that create more work for them, but that Symantec's goal of
being an industry leader requires Symantec to lead their customers rather
than to follow their instructions.
* A belief that automation by default, on the part of both CAs and their
customers, is a collective action problem that is worth challenging the
industry to solve.

Those are the kinds of indicators that Mozilla and Google tend to weight
favorably in assessing the likelihood of future risk to users from a CA's
practices. So, I suggest that Mozilla and Google consider offering to drop
the portions of their proposals that limit Symantec's certificate lifetime,
if Symantec commits to supporting an industry-wide reduction in certificate
lifetimes to 13 months.

A commitment like this could take several forms, but to me it might look
like:

* Symantec publicly and privately asking the browser programs to impose an
industry-wide reduction by a reasonable date, whether or not a majority of
browsers support it, and whether or not 2/3 of CAs support it.
* Symantec proposing a ballot to impose this through the CA/Browser Forum's
Baseline requirements.
* Symantec immediately beginning to communicate to their customers the

Re: Symantec Conclusions and Next Steps

2017-04-28 Thread Eric Mill via dev-security-policy
On Fri, Apr 28, 2017 at 4:16 AM, Richard Wang via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> The problem with this Google decision is that some big websites using a
> domain not listed in the Alexa 1M suffered disruption; for example, Qihoo
> 360’s search site and online gaming sites used a domain in a CDN for
> pictures that is not listed in the Top 1M,
>

That's a plausible and interesting point about gauging impact to the Alexa
Top 1M. If the goal is to avoid affecting them, analyzing the resources
they pull from other origins has to be part of that.

-- Eric
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: [EXT] Re: Questions for Symantec

2017-04-21 Thread Eric Mill via dev-security-policy
On Thu, Apr 20, 2017 at 8:04 PM, Steve Medin via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

>
> > -----Original Message-----
> > On 03/04/17 13:11, Gervase Markham wrote:
> > > Hi Steve and Rick,
> >
> > Q9) Can you please tell us which audit covers the following two
> > intermediate CAs, which are subordinates of or cross-certified by
> > VeriSign Universal Root Certification Authority?
>
> These Intermediate CAs are sub-CAs under the Verisign Universal Root CA.
> They are covered under Symantec’s Non-Fed SSP audits, and the latest
> unqualified audits that we just received are being published.
>
> The customer-specific CAs (the subordinate ICAs) signed by these sub-CAs
> are path length constrained and operate fully within Symantec’s
> infrastructure. Under the Non-Federal SSP program, they are used to issue
> certificates for Microsoft Windows domain controllers and IPSec endpoints.
> End entity certificates issued under this program are designed only to
> contain Federal PKI policy OIDs and to exclude any CA/B Forum required
> policy OIDs.
>

For reference, the two links Gerv referenced were for unexpired
certificates issued by these two sub-CAs:

https://crt.sh/?Identity=%25=1384=expired
https://crt.sh/?Identity=%25=12352=expired

"pathlen:0" displays on crt.sh as a basic constraint for all certificates
listed there.

The FPKI cross-signs at issue in Issue L are now expired (and so don't show
on the links above). They do show when expired certificates are included --
there are 6 of them with OU=FPKI:
https://crt.sh/?Identity=%25=1384

Each of those certificates lacks a pathlen:0 constraint, and they appear to be
the only ones that do. Symantec noted that they are path length constrained
in their response, but since they also referenced Federal PKI policy OIDs
(which are not respected by Web PKI clients), I thought it was worth being
explicit about the difference between the certificates referenced here and
those referenced in Issue L.
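For anyone who wants to verify a path length constraint directly rather than
via crt.sh, the extension is visible in the certificate text. A minimal
sketch using a throwaway self-signed demo CA invented for illustration (not
one of the Symantec sub-CAs discussed here); it assumes only the openssl CLI:

```shell
# Build a minimal config for a demo CA carrying the pathlen:0 constraint
# discussed above, so the example is self-contained.
cat > /tmp/demo-ca.cnf <<'EOF'
[req]
distinguished_name = dn
x509_extensions = v3
prompt = no
[dn]
CN = Demo Constrained CA
[v3]
basicConstraints = critical, CA:TRUE, pathlen:0
EOF

# Generate a short-lived self-signed certificate from that config.
openssl req -x509 -newkey rsa:2048 -nodes -config /tmp/demo-ca.cnf \
  -keyout /tmp/demo-ca.key -out /tmp/demo-ca.pem -days 1

# Print the Basic Constraints extension, the same field crt.sh displays.
openssl x509 -in /tmp/demo-ca.pem -noout -text | grep -A1 "Basic Constraints"
```

A pathlen:0 CA can sign end-entity certificates but no further intermediates,
which is why its absence on the cross-signs referenced above is notable.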

-- Eric
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: Policy 2.5 Proposal: Remove the bullet about "fraudulent use"

2017-04-21 Thread Eric Mill via dev-security-policy
I strongly support removing any ambiguity about CAs not being required to
police certificate issuance, and agree on the unuseful level of
subjectivity that would be present in any attempt to enforce this clause.

-- Eric

On Thu, Apr 20, 2017 at 7:11 PM, Matt Palmer via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> On Thu, Apr 20, 2017 at 02:39:12PM +0100, Gervase Markham via
> dev-security-policy wrote:
> > So I propose removing it, and reformatting the section accordingly.
>
> Do t.  Do t nw!
>
> (That's me strongly agreeing with the proposal, in case my faux-Ren accent
> is impenetrable)
>
> - Matt
>
> ___
> dev-security-policy mailing list
> dev-security-policy@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-security-policy
>



-- 
konklone.com | @konklone 
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: Removing "Wildcard DV Certs" from Potentially Problematic Practices list

2017-04-21 Thread Eric Mill via dev-security-policy
Major +1. Removing this language is consonant with Mozilla objectives, with
Web PKI trends, and with the health of the open web.

-- Eric

On Thu, Apr 20, 2017 at 9:02 AM, Gervase Markham via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> There is an entry on Mozilla's Potentially Problematic CA Practices list
> for Wildcard DV certs:
> https://wiki.mozilla.org/CA:Problematic_Practices#Wildcard_DV_SSL_Certificates
>
> This text was added by Frank Hecker when this page was very new back in
> 2008, and has been basically unchanged since then:
> https://wiki.mozilla.org/index.php?title=CA:Problematic_Practices=
> 92109=92084
>
> I don't believe the issuance of wildcard DV certs is problematic in
> practice. Mozilla is of the view that ubiquitous SSL is the highest
> priority for the Web PKI, and wildcard certs are a part of that. Mozilla
> also doesn't believe that it's the job of CAs to police phishing, which
> is the concern raised.
>
> I propose this section be removed from the document.
>
> Gerv
> ___
> dev-security-policy mailing list
> dev-security-policy@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-security-policy
>



-- 
konklone.com | @konklone 
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: Symantec Response L

2017-04-16 Thread Eric Mill via dev-security-policy
For the benefit of the list, I'm the author of that text and that quote is
from this page, which is maintained by the General Services Administration
(though again, not by the Federal PKI team):

https://https.cio.gov/certificates/#does-the-us-government-operate-a-publicly-trusted-certificate-authority%3f

The intended audience is federal agencies, and the intended takeaway is
that certificates from the Federal Common Policy CA should not be used for
TLS/HTTPS services where the expected client base is "the general public",
since the Federal PKI is not a member of the Mozilla root program.

Certificates from the Federal PKI can obviously be used where the client
base can be expected to trust its root CA, and there are many such uses of
the Federal PKI.

-- Eric

On Sun, Apr 16, 2017 at 8:50 PM, Peter Bachman via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> Since we use ACES certificates for sending healthcare information in a way
> that mimimizes MITM, I was surprised to read the following.
>
>
> "The Federal PKI has cross-certified other agencies and commercial CAs,
> which means their certificates will be trusted by clients that trust the
> Federal PKI. However, none of these roots are publicly trusted. Even when a
> publicly trusted commercial CA is cross-certified with the Federal PKI,
> they maintain complete separation between their publicly trusted
> certificates and their Federal PKI cross-certified certificates.
>
> As a result, there is not currently a viable way to obtain an individual
> certificate for use in TLS/HTTPS that is issued or trusted by the Federal
> PKI, and also trusted by the general public."
>
> Source CIO Council
>
>
>
> The new ACES CP dated Jan 17 2017 does not assure public use of the ACES
> root.
>
> ___
> dev-security-policy mailing list
> dev-security-policy@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-security-policy
>



-- 
Eric Mill
Senior Advisor, Technology Transformation Service, GSA
eric.m...@gsa.gov, +1-617-314-0966
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: Symantec Response L

2017-04-12 Thread Eric Mill via dev-security-policy
On Wed, Apr 12, 2017 at 4:53 AM, Gervase Markham via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> On 11/04/17 22:08, Eric Mill wrote:
> > I'll leave it to others to opine on the severity of the mistake and the
> > quality of the response, but I do want to at least properly communicate
> the
> > impact.
>
> Thank you. I have updated my write-up for Issue L.
>

Great. I see one inaccuracy in the text there right now:

When this was drawn to their attention, Symantec did not revoke the
cross-sign certificate under discussion, instead allowing it to expire
(less than a month later).


The cross-signature was brought to Symantec's attention in mid-February
2016. The certificate expired at the end of July 2016. The current text
says "less than a month later".

I believe that "less than a month later" is meant to reference the time
between when Symantec obtained concurrence from the Federal PKI about
undoing the cross-signature, and when the certificate expired.

Identrust revoked their similar cross-signature in mid-late February, a
week or so after being notified of the issue by Richard Barnes (then of
Mozilla).

-- Eric


>
> Gerv
>
> ___
> dev-security-policy mailing list
> dev-security-policy@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-security-policy
>



-- 
Eric Mill
Senior Advisor, Technology Transformation Service, GSA
eric.m...@gsa.gov, +1-617-314-0966
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: Symantec Response L

2017-04-11 Thread Eric Mill via dev-security-policy
On Tue, Apr 11, 2017 at 6:37 AM, Gervase Markham via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

>
> On 11/04/17 04:45, Eric Mill wrote:
>
> > But I think it's important to note that this relationship was not widely
> > understood or publicly discussed as part of the Mozilla trusted root
> > program, between 2009 and 2016.
>
> And you think that's bad?
>

An (interactive) picture might help illustrate what I'm pointing to. This
is the Federal PKI:
https://fpki-graph.fpki-lab.gov

There's something like 200 civilian, military, and non-government CAs in
there, connected through a huge number of bridges and cross-signatures.
Despite the name, the Federal PKI contains more than the federal government
-- within that graph are signatures bridging over to sector-wide PKIs such
as SAFE-BioPharma. In the center is the Federal Common Policy CA, which
ultimately everything can be chained up to.

For the time that the cross-signature was active (the one in question is
here - https://crt.sh/?id=12638543 and was ~8 months beginning in December
2015), all 200 of those CAs were capable of issuing a certificate that
would be technically trusted by users of the Mozilla root store. I haven't
looked to see whether there were other cross-signatures issued by VeriSign
or Symantec since the cross-signer's parent CA was admitted to the Mozilla
root store around 2009.

All that's been said here by Symantec on this issue's impact is that the
discussion around this made it clear that browsers don't respect
certificate policy identifiers (OIDs). Those policy identifiers would have
been, as I understand it, the sole technical constraint capable of
protecting users of the Mozilla trust store from mis-issuance from any of
these 200 CAs, had clients respected them.
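
For readers unfamiliar with policy processing: under RFC 5280, a relying party
that enforced certificate policies would accept a chain only if every CA in it
asserted the required policy OID (or anyPolicy), after applying any policy
mappings. A simplified sketch of the check that browsers skip, using
illustrative OIDs rather than the real FPKI values:

```python
# Sketch: simplified certificate-policy enforcement (RFC 5280 style).
# Real path validation also handles policy mappings and inhibit-anyPolicy;
# this deliberately ignores both. OIDs here are illustrative.
ANY_POLICY = "2.5.29.32.0"
REQUIRED_POLICY = "2.16.840.1.101.3.2.1.3.13"  # example policy OID

def chain_satisfies_policy(chain_policies, required):
    """Accept only if every cert in the chain asserts the required
    policy (or anyPolicy). chain_policies is one set of OIDs per cert."""
    return all(required in p or ANY_POLICY in p for p in chain_policies)

# A bridged chain where one CA never asserted the browser-required policy:
bridged = [{"1.2.3.4"}, {ANY_POLICY}, {REQUIRED_POLICY}]
print(chain_satisfies_policy(bridged, REQUIRED_POLICY))  # rejected
```

Because browsers never run this check, a policy OID asserted (or omitted) by a
bridged CA had no effect on whether its certificates were trusted.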

I'll leave it to others to opine on the severity of the mistake and the
quality of the response, but I do want to at least properly communicate the
impact.

-- Eric


> There were several discussions about including the FPKI roots during
> this time, and about the problems that might cause. I might expect
> someone reading those, who knew that we already trusted bits (or all?)
> of the FPKI due to their actions, to say something...
>
> Gerv



-- 
Eric Mill
Senior Advisor, Technology Transformation Service, GSA
eric.m...@gsa.gov, +1-617-314-0966


Re: Symantec Response L

2017-04-11 Thread Eric Mill via dev-security-policy
On Tue, Apr 11, 2017 at 6:37 AM, Gervase Markham via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> Hi Eric,
>
> Perhaps you are being intentionally non-directive, in which case perhaps
> you can't answer my questions, but:
>

Yes, I am being intentionally non-directive. I'll leave the opinions to
others with more historical familiarity with the relevant programs and
policies.

-- Eric


Re: Symantec Response L

2017-04-10 Thread Eric Mill via dev-security-policy
On Mon, Apr 10, 2017 at 10:56 AM, Steve Medin via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> Issue L: Cross-Signing the US Federal Bridge (February 2011 - July 2016)
>
> Symantec, as well as VeriSign, has participated in the FPKI since 2006,
> and we take our responsibility as a participant of this program very
> seriously. When Symantec began participating in FPKI, FPKI rules required
> two-way cross-certification in a networked PKI model. In addition, FPKI
> rules mandated multiple assurance levels, which we mapped to our Class 1,
> Class 2 and Class 3 roots. Class 3 roots are the only ones that have ever
> been enabled for TLS server certificate issuance.
>

A few things up front:

* My information could be incomplete.
* Symantec's responses to my questions when I brought this issue to their
attention in 2016 were always clear, professional, and timely.
* While we're at the same agency and we do collaborate, I don't work on the
Federal PKI team, and this message represents only my individual efforts
and not the Federal PKI or all of GSA.

But I want to add some color here and note that Symantec made a public
statement on m.d.s.p. in December 2011 that seems to indicate that the root
which created the cross-sign in question came in through a VeriSign
purchase:

https://groups.google.com/forum/#!msg/mozilla.dev.security.policy/0xJClZlkO3w/CXjlamuOO-sJ

That root certificate's name ("VeriSign Class 3 SSP Intermediate CA - G2")
was never mentioned in Bugzilla, and was not discussed during the inclusion
of its parent CA ("VeriSign Universal Root Certification Authority"):
https://bugzilla.mozilla.org/show_bug.cgi?id=484901

While Symantec's CPS in 2016 mentions the Federal Bridge, the CPS that
VeriSign had at the time they submitted that parent CA to Mozilla's program
in 2009 does not mention the Federal PKI in any way:

https://web.archive.org/web/20090612085619/http://www.verisign.com/repository/CPSv3.8.1_final.pdf

I am not familiar with what Mozilla's policies were in 2009, and I know
there was a great deal of effort to draw attention to undisclosed
intermediates in 2016 -- that effort is what drew attention to these
cross-signatures.

But I think it's important to note that this relationship was not widely
understood or publicly discussed as part of the Mozilla trusted root
program, between 2009 and 2016.

In February 2016, Eric Mill prompted discussions with Symantec and the
> community about why the cross-certification resulted in some FPKI certs
> being trusted in browsers at https://github.com/18F/fpki-testing/issues/1.
> That discussion highlighted that browsers didn't process certificate policy
> extensions content during path building, while FPKI made extensive use of
> policy processing.


The discussion above is long and interesting, and definitely does highlight
that browsers don't process certificate policy extensions. The discussion
shows that this was a surprise to some participants. However, I would not
necessarily expect this to be a surprise to all participants in the web PKI
ecosystem.

We had already engaged with FPKI personnel to address this concern, and
> further engaged to determine if one-way cross-certification from FPKI to
> Symantec was sufficient, such that we could remove the cross-certification
> from Symantec to FPKI. On July 5, 2016,  FPKI notified Symantec that the
> cross-certificate, which was set to expire July 31, 2016, would not be
> required.


> Because we have a responsibility to our customers to ensure their
> businesses remain uninterrupted, we knew that communication and giving them
> adequate time to adjust to the unscheduled change in trust was critical. In
> order to effect minimal disruption, we allowed the cross-certificate to
> expire on July 31, 2016, rather than revoking it sooner.
>

Identrust was in a nearly identical position, having been asked about an
undisclosed cross-signature of the FPKI at the same time as it was pointed
out to Symantec:

https://bugzilla.mozilla.org/show_bug.cgi?id=1037590#c21

I am not aware what differences may exist between Symantec's and
Identrust's arrangements with the Federal PKI. However, Identrust made a
prompt decision and revoked the certificate by February 19th.

-- Eric





-- 
Eric Mill
Senior Advisor, Technology Transformation Service, GSA
eric.m...@gsa.gov, +1-617-314-0966


Re: 360 team hacks Chrome

2017-03-06 Thread Eric Mill via dev-security-policy
I'll include Richard Barnes' response to cabfpublic here too, for
completeness:

-- Forwarded message --
From: "Richard Barnes via Public" 
Date: Mar 6, 2017 8:58 AM
Subject: Re: [cabfpub] 360 team hacks Chrome
To: "CA/Browser Forum Public Discussion List" 
Cc: "Richard Barnes" 

Richard: Is there any particular reason you're posting year-old security
news here?

To add some context for those who might not be familiar with pwn2own,
"Hacked in 11 minutes" is not a surprising result. Most browsers that are
included in pwn2own get hacked (as do most targets in general).  The bounty is
rich enough that vulnerability researchers put significant effort into
preparation.  It's an important way that browser vendors find out about
security exploits.

Pwn2own 2017 is in a couple of weeks:

http://zerodayinitiative.com/Pwn2Own2017Rules.html




On Mar 6, 2017 9:19 AM, "Richard Wang via dev-security-policy" <
dev-security-policy@lists.mozilla.org> wrote:

Sorry, I posted an old news that I just saw it.
Please ignore it.

Best Regards,

Richard

> On 6 Mar 2017, at 21:45, Richard Wang via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:
>
> Pwn2Own 2016: Chinese Researcher Hacks Google Chrome within 11 minutes
> http://www.prnewswire.com/news-releases/pwn2own-2016-chinese-researcher-hacks-google-chrome-within-11-minutes-300237705.html
>
>
> Best Regards,
>
> Richard


Re: A new US government CA for the web PKI

2017-03-05 Thread Eric Mill via dev-security-policy
On Fri, Mar 3, 2017 at 6:25 AM, Gervase Markham via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> On 02/03/17 20:45, Eric Mill wrote:
> > Our goal is to start a new root and set of issuing CAs that is completely
> > disconnected and separate from the existing Federal PKI bridge network
> that
> > members of the web PKI community may be familiar with.
>
> Are you able to say whether you will be seeking a cross-sign from an
> existing publicly-trusted cert to bootstrap your ubiquity?
>

That's definitely being considered, as it would be an obvious way to
accelerate the utility of a new CA intended for public trust.


> I note that some chap called Eric commented a couple of years ago that
> newly-added certificates would take a long time to be well enough
> distributed for USG websites to rely on them:
> https://bugzilla.mozilla.org/show_bug.cgi?id=478418#c70
> :-)
>

Seems like a reasonable guy...


> > government operated devices, and so we welcome appropriately narrow name
> > constraints that reflect that.
>
> Will you be encoding these constraints in your roots and/or
> intermediates, or will you be requesting that people shipping your roots
> impose them externally?
>
> If you are considering putting them in the roots, you may want to talk
> to HARICA, who attempted this and (I believe) ran into one or two issues.
>

That's the exact kind of question for which we could really use community
input.

We do have a general discussion thread open, with GSA and DoD staff
contributing, to discuss the breadth of the constraints and potential
implementation issues:
https://github.com/uspki/policies/issues/12

I know I definitely don't have a complete understanding of client support
and failure modes for in-certificate constraints in today's ecosystem.
Breadth of enforcement is a factor, and so is breadth of support and
reliability.
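
For context, the dNSName name-constraint rule in RFC 5280 is straightforward:
a permitted subtree covers the constraint name itself plus anything formed by
adding labels on its left. A small sketch of that matching logic (the
constraint values below are illustrative, not a proposal):

```python
# Sketch: RFC 5280 dNSName name-constraint matching.
# A name satisfies a constraint if it equals the constraint or is a
# subdomain of it (labels added on the left, on a label boundary).
def dns_name_permitted(name, permitted_subtrees):
    name = name.lower().rstrip(".")
    for constraint in permitted_subtrees:
        c = constraint.lower().rstrip(".")
        if name == c or name.endswith("." + c):
            return True
    return False

permitted = ["gov", "mil"]  # illustrative permitted subtrees
print(dns_name_permitted("www.example.gov", permitted))   # True
print(dns_name_permitted("example-gov.com", permitted))   # False
```

The matching rule itself is simple; the open questions are around client
support, whether clients treat an unrecognized critical constraint as fatal,
and what happens in software that doesn't enforce constraints at all.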


>
> > Since we’re not yet an applicant, this forum may not be the best place
> for
> > an extended discussion (though we’re happy to engage in discussion here
> if
> > people would like)
>
> This forum hosts general WebPKI discussion; you are welcome to keep us
> updated on your progress.
>

Thank you!

-- Eric


>
> Gerv



-- 
Eric Mill
Senior Advisor, Technology Transformation Service, GSA
eric.m...@gsa.gov, +1-617-314-0966


A new US government CA for the web PKI

2017-03-02 Thread Eric Mill via dev-security-policy
Hi all,

Though we’re not at the point of filing an application for Mozilla’s root
program, I wanted to share with this community the beginnings of an effort
by the US government to start a new PKI intended for publicly trusted
certificates. This effort is being led by the General Services
Administration and the Department of Defense.

Our goal is to start a new root and set of issuing CAs that is completely
disconnected and separate from the existing Federal PKI bridge network that
members of the web PKI community may be familiar with. The existing Federal
PKI is used to issue many kinds of certificates, including those used for
enterprise devices and for government personal identity verification (PIV).

This new hierarchy would focus only on certificates intended for devices on
the internet, rather than people, and its operation and policies are
intended to adhere strictly to web PKI requirements, as expressed through
the CA/Browser Forum’s Baseline Requirements and those of various root
programs. In addition, this hierarchy is intended only to serve US
government operated devices, and so we welcome appropriately narrow name
constraints that reflect that.
While we’re still in the early stages, we are working on the root policy
documents -- including a CP, CPS, and various certificate profiles -- in
public on GitHub:

https://github.com/uspki/policies

One additional thing I’d like to mention is that we’re fully in support of
the goals of Certificate Transparency. This project was initiated prior to
Chrome announcing its October 2017 CT requirement, and our intent from the
beginning has been to log 100% of issued certificates, with no special need
for redaction. As part of this, we are evaluating the possibility of
creating a new CT log that can issue SCTs considered valid by browsers for
policy enforcement.
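
For those less familiar with CT internals: an RFC 6962 log commits to its
entries with a Merkle tree, and the tree head is what the log signs in the
signed tree heads that SCTs promise inclusion under. The head computation
itself is small enough to sketch:

```python
# Sketch: RFC 6962 Merkle Tree Head (MTH) computation.
# Leaves are hashed with a 0x00 prefix, interior nodes with 0x01,
# and the tree splits at the largest power of two smaller than n.
import hashlib

def _sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_tree_head(leaves):
    if not leaves:
        return _sha256(b"")            # MTH of the empty tree
    if len(leaves) == 1:
        return _sha256(b"\x00" + leaves[0])
    k = 1
    while k * 2 < len(leaves):
        k *= 2
    return _sha256(b"\x01" + merkle_tree_head(leaves[:k])
                           + merkle_tree_head(leaves[k:]))

# The byte strings stand in for serialized log entries.
print(merkle_tree_head([b"cert-entry-1", b"cert-entry-2"]).hex())
```

The same structure is what makes inclusion and consistency proofs cheap, which
is why logging 100% of issuance scales without trusting the log operator.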

We generally intend the issuing CAs to support automated certificate
issuance, which includes evaluating existing standard protocols. In
general, we expect to use and support open standards and open source tools
where they support the effort.
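
One candidate for such a protocol is ACME (then an IETF draft, later published
as RFC 8555). Its domain-control challenges hinge on a key authorization: the
CA's challenge token joined with an RFC 7638 thumbprint of the account key. A
sketch with placeholder key material (not a real key or token):

```python
# Sketch: ACME key authorization = token + "." + JWK thumbprint.
# The JWK values and token below are placeholders, not real material.
import base64
import hashlib
import json

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def jwk_thumbprint(jwk: dict) -> str:
    """RFC 7638: SHA-256 over the canonical JSON of the key's required
    members, keys sorted lexicographically, no whitespace."""
    canonical = json.dumps(jwk, sort_keys=True, separators=(",", ":"))
    return b64url(hashlib.sha256(canonical.encode()).digest())

# Hypothetical RSA public key as a JWK (placeholder values).
jwk = {"e": "AQAB", "kty": "RSA", "n": "0vx7..."}
token = "evaGxfADs6pSRb2LAv9IZf17Dt3juxGJ-PCt92wr-oA"  # from the CA
key_authorization = token + "." + jwk_thumbprint(jwk)
print(key_authorization)
```

The client then provisions this value over HTTP or DNS for the CA to verify,
which is what makes fully automated, unattended issuance practical.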

Since we’re not yet an applicant, this forum may not be the best place for
an extended discussion (though we’re happy to engage in discussion here if
people would like), but we’re actively seeking public participation and
input during the process -- issues and pull requests to the GitHub
repository above are quite welcome, and we’ll create additional repos as we
go for other parts of the project.

As we make progress, we hope to contribute positively to the web PKI and CT
ecosystem, and we plan to be engaging publicly with the community here and
other places along the way.

-- Eric

(P.S. This is my first email to the list from my work .gov address, so I'll
just quickly note that this means I'm speaking in my work capacity. Emails
that are not from my work address are not speaking in my work capacity.)

-- 
Eric Mill
Senior Advisor, Technology Transformation Service, GSA
eric.m...@gsa.gov, +1-617-314-0966


Re: Let's Encrypt appears to issue a certificate for a domain that doesn't exist

2017-02-23 Thread Eric Mill via dev-security-policy
This list hosted an extensive discussion on this issue in May of 2016,
subject line "SSL Certs for Malicious Websites":

https://groups.google.com/d/topic/mozilla.dev.security.policy/vMrncPi3tx8/discussion

Most (all?) of the people on this thread participated on that one, and said
most (all?) of these things. It's probably not worth rehashing it in a new
thread that started on a different topic (misissuance to a non-existing
domain) that is now resolved.

-- Eric

On Thu, Feb 23, 2017 at 6:29 PM, Matt Palmer via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> On Thu, Feb 23, 2017 at 03:55:43AM +, Richard Wang via
> dev-security-policy wrote:
> > If "apple", "google", "Microsoft" is not a high risk domain, then I
> don’t know which domain is high risk domain, maybe only "github".
>
> That's kinda the problem: you don't know, and neither does anyone else,
> because there's no agreed-upon definition or policy for what constitutes a
> "high risk domain".
>
> - Matt
>



-- 
konklone.com | @konklone 


Re: SHA-1 serverAuth cert issued by Trustis in November 2016

2017-02-16 Thread Eric Mill via dev-security-policy
On Thu, Feb 16, 2017 at 8:26 PM, blake.morgan--- via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:
>
>
> Trustis has now revoked the SHA-1 Certificate for hmrcset.trustis.com and
> replaced it with a SHA-256 Certificate.  This status is reflected in the
> latest CRL.
>

Blake, respectfully, that's not very much detail. That doesn't describe how
the certificate was issued contrary to Mozilla policy, nor what changes
Trustis may have made to ensure it doesn't happen again.

-- Eric

>
> Kind regards,
>
> Blake Morgan
> Trustis Ltd



-- 
konklone.com | @konklone 


Re: Misissued/Suspicious Symantec Certificates

2017-02-12 Thread Eric Mill via dev-security-policy
Also relevant are Symantec's statements about two E regional auditors.

One section describes contradictions from E KR (Korea) in describing why
some CrossCert issuing CAs were not in scope:

• The list of CAs in the audit was produced by CrossCert and given to E
KR as the scope to audit. It was not given to E by Symantec.

• E KR initially stated that CrossCert did not fully disclose the list of
CAs. E KR later stated that CrossCert provided a list of all their
issuing CAs but reduced the list of issuing CAs in scope of sampling for
budgetary reasons.

• Due to these conflicting statements and further discoveries explained
below, Symantec will no longer accept audits from E KR.


And a second section is about contradictions and delays in describing the
scope of an audit that E BR (Brazil) performed on Certisign:

E BR produced two deficient letters regarding the 2014 and 2015 Certisign
audits. Initially we received a letter that stated a January 1, 2014 to
December 31, 2014 audit period in its introduction and a January 1, 2014 to
December 31, 2015 audit period in its conclusion. The letter appeared to
cover a two year period. We asked for clarification multiple times. That
clarifying letter stated a 2015 audit period.

E BR does not meet our requirements for RA audit quality, timeliness, and
responsiveness to our demands. Symantec will no longer accept audits from
E BR should we have a future need for in-market audit support.




On Sun, Feb 12, 2017 at 2:11 PM, Eric Mill  wrote:

> Though Nick's email implies the announcement, for the benefit of the list,
> here's Symantec's introduction at the top of their response:
>
> Based on our investigation of CrossCert, we have concerns due to (1)
> demonstrated non-compliance with processes and controls, (2) assertions of
> third party auditors that need far greater oversight than we previously
> expected, and (3) the fact that these issues have enabled cases of
> certificate mis-issuance. As a result, we have made the decision to
> terminate our partner RA program.
>
> We will continue to work with select partners that have local market
> contacts and expertise to facilitate an interface with customers and
> collection of relevant documentation, however Symantec personnel will
> validate 100% of all asserted identity data and control certificate
> issuance going forward. We have communicated this change to each of our RA
> partners, we are finalizing a transition plan, and intend to implement that
> transition quickly.
>
> In addition, to alleviate any concern by customers or relying parties on
> the integrity of the certificates issued by these RA partners, Symantec
> will review the validation work of 100% of issued certificates and
> revalidate any where we identify any deficiency. Certificates issued with
> deficient validation will be replaced and revoked. Our work will be
> included in scope of our next WebTrust audits.
>
>
> On Sun, Feb 12, 2017 at 1:02 PM, Nick Lamb via dev-security-policy <
> dev-security-policy@lists.mozilla.org> wrote:
>
>> On Sunday, 12 February 2017 15:28:26 UTC, Steve Medin  wrote:
>> > A response is now available in Bugzilla 1334377 and directly at:
>> > https://bugzilla.mozilla.org/attachment.cgi?id=8836487
>>
>> Thanks for these responses Steve,
>>
>> I believe that Symantec's decision to terminate the RA Partner programme
>> was a good one, not only in light of what's been found during this specific
>> investigation, but also because it makes the CA function within Symantec
>> simpler. It definitely feels as though some of the issues (big and small)
>> with Symantec's CA function in the past few years grew out of complexity.
>> Simpler systems are easier to correctly reason about and thus to manage
>> properly.
>>
>> Simpler systems are also easier for the Root Programmes to oversee and
>> for the Relying Parties to put their trust in. This group has fought
>> against the presumption that "foreign" CAs are necessarily less
>> trustworthy, but the fact is that a person who was happy with a Symantec
>> certificate on the basis that it was issued by a famous US Corporation
>> might have been very surprised to learn the decision to issue was actually
>> taken by a company they've never heard of in Korea, or Brazil.
>>
>> Given Symantec's experiences here, I would recommend that Mozilla's
>> routine letter to CAs might ask them if they have any similar programme and
>> if so what measures they have in place to ensure their RAs or similar Third
>> Parties are really living up to the standards Mozilla requires. Depending
>> on the responses this might need further action from Mozilla. It would also
>> make sense to ask about this during new CA enrollment. There's maybe a
>> small piece of work here to figure out what sort of characteristics best
>> distinguish something like Symantec's relationship with Crosscert from
>> unremarkable business practices like corporate accounts to issue many
>> certificates without