Re: About upcoming limits on trusted certificates

2020-03-16 Thread Gijs Kruitbosch via dev-security-policy

On 14/03/2020 18:53, Nick Lamb wrote:

> my assumption is that at
> best such a patch would be in the big pile of volunteer stuff maybe
> nobody has time to look at.


Tangential: perhaps there's an aspect of phrasing here that is confusing
me, but this reads to me as suggesting we don't review or work with
volunteer code contributions. I'd like to be explicit that we do our best
to do so, and I am unaware of big piles of un-looked-at
volunteer-contributed patches (having been such a volunteer myself in
the past).


I can't speak for the crypto team (though it looks like Kathleen has
relayed an answer for the concrete bug you asked about), but if you know
of Firefox patches that are sitting without due attention, please feel
free to nudge me. And no, that approach might not scale in theory, which
is why other folks are building better tooling to ensure we don't end up
with trees falling in forests unheard, as it were. But in the meantime,
feel free to ping me (off-list).


~ Gijs


Re: Website owner survey data on identity, browser UIs, and the EV UI

2019-09-22 Thread Gijs Kruitbosch via dev-security-policy
(For the avoidance of doubt, although I work for Mozilla, as noted on 
the wiki I post in a personal capacity)


In addition to Ryan's excellent points, I wanted to briefly point out a 
few things related to your survey:


On 22/09/2019 00:52, Kirk Hall wrote:

> (1) *97%* of respondents agreed or strongly agreed with the statement: “Customers /
> users have the right to know which organization is running a website if the website asks
> the user to provide sensitive data."


Although I intuitively would like to think that we have a right to know
"who is running a website", this doesn't mean that EV certificate
information is an appropriate vehicle for that information. Even setting
aside all the significant issues that EV certification has - even if we
pretended it was perfect - it only shows UI for the TLS connection made
for the top-level document, whereas other resources and subframes can
easily come (and usually do) from other domains that either do not have
an EV cert or have one belonging to a different entity. And even if that
were not the case, the entity controlling the website does not
necessarily control the data in a legal sense. So the EV UI does not, in
the legal sense, always indicate who will control the "sensitive data"
that users/customers submit.



> (2) *93%* of respondents agreed or strongly agreed with the statement “Identity
> on the Internet is becoming increasingly important over time.”


This sounds very nice but doesn't mean anything. What kind of identity? 
Whose identity? Important to whom? Why does it have anything to do with EV?



> (3) When respondents were asked “How important is it that your website has an
> SSL certificate that tells customers they are at your company's official
> website via a unique and consistent UI in the URL bar?” *74%* said it was
> either extremely important or very important to them. Another *13%* said it was
> somewhat important (total: *87%*).


This again sounds very nice, but surely what actually matters is that
(potential) customers realize when they are *not* at that official
website - when some other website tries to persuade them to part with
their data/money - so that they don't, or if they do, don't blame the
"real" company later? As has been pointed out repeatedly in this forum,
we have pretty good evidence that customers do not, in fact, notice the
absence of the EV indicator, as well as evidence that such indicators
can be "spoofed", viz. the Stripe Inc. work.


If anything, this survey shows that the 87% of people who thought this 
was important misunderstood where the risks of digital identity 
confusion lie.



> (4) When respondents were asked “Do you believe that positive visual signals in
> the browser UI (such as the EV UI for EV sites) are important to encourage
> website owners to choose EV certificates and undergo the EV validation process
> for their organization?” *73%* said it was either extremely important or very
> important to them. Another *17%* said it was somewhat important (total *90%*).


This implies that the UI is the/a main motivator for people to get these 
certificates, but doesn't by itself have any implications for the 
importance of that UI in keeping consumers and businesses safe.


If 90% of people surveyed think that people should wear helmets when 
cycling, that's good for people selling bicycle helmets but doesn't have 
anything to do with how effective those helmets are at preventing 
injuries in cyclists.



> (5) *92%* agreed or strongly agreed with the statement: “Web browser security
> indicators should be standardized across different browsers to make the UI
> easier for users to understand.”
>
> (6) Finally, when asked “Do you think browsers should standardize among
> themselves on a common Extended Validation UI so that it appears roughly the
> same in all browsers?” *91%* said yes.


Both of these actually appear to be arguments for Firefox not to
reinstate its in-address-bar EV UI, given that all the other browsers
have moved this information out of there. The most consistent UI is one
that only provides this information when the user activates
(clicks/taps/...) the lock icon, which is what browsers have now pretty
universally implemented.



> We again recommend the binary Apple UI to all browsers, which works in both
> desktop and mobile environments and distinguishes between EV/identity sites
> (with a green lock symbol and URL) and DV/anonymous sites (with a black lock
> symbol and URL) – check it out in an iPhone.  (Apple did not eliminate the EV
> UI, as some has erroneously said.)  This is easy for users to understand at a
> glance.


With due respect to the good folks at Apple, I do not believe this is an 
accessible solution (distinguishing information only by colour, 
https://www.w3.org/TR/WCAG20/#visual-audio-contrast ).


Additionally, (even if we presuppose EV certs were perfect) it does not 
help address the requests made in your survey's questions #1 and #3, ie 
which organization is 

Finance analogies for root stores (was: Re: DarkMatter Concerns)

2019-07-22 Thread Gijs Kruitbosch via dev-security-policy
(I'm splitting the topic because at this point, continuing to discuss 
the analogy doesn't have a direct bearing on the inclusion or otherwise 
of DM)


Replies inline.

On 16/07/2019 23:23, Matthew Hardeman wrote:

> I submit that I disagree somewhat with Gijs' suggestion that Mozilla acts
> in the nature of a third-party guarantor here.  I further submit that the
> more direct analogue is that the community of Mozilla users present
> and future is the set of depositing members of the Mozilla Trust Credit
> Union, a bank of trust/credit which is lended out to CAs from the pool of
> trust + good will of those users -- that pool being under the direction and
> management of the Mozilla organization, who, I believe, are literally
> acting in the nature of a lender, loaning out the pooled assets (in this
> case the sum of the trust extended to Mozilla) to qualified trust-borrowers
> (CAs).  Mozilla is explicitly in the position of making decisions regarding
> where to invest that pooled trust.


But as I already said, unlike financial institutions the trust store cannot:

- moderate the amount of trust/money
- tailor whatever the equivalent of repayments or interest would be to
the profile of risk of a particular debtor (CA)
- demand security (collateral) from the debtor (CA)
- demand a guarantor who is liable if the debtor (CA) "defaults".

Instead, it is bound to treat every admitted debtor (CA) exactly the 
same, and a yes/no is the only decision it can make. That's... not 
really any power over the debtor at all, besides denying them access to 
the users ("pool of trust") *for the future only*. That is, at any point 
the trust store can stop providing further trust (credit). But there's 
almost nothing it can do about trust (credit) that's already been 
provided (certainly, in Mozilla's case, about TLS transactions in the 
past involving certs issued by the debtor/CA).



> Indeed, if Mozilla is a mere guarantor in this process, who precisely is
> the lender?


The lenders are the users, the CAs are debtors. Same as your example,
really, except you're putting the trust store in a position where you
claim it can control the debtors (i.e. as a credit union of users'
trust), and I'm pointing out that such control does not exist in the
same way as it does for lenders in the financial system. Instead the
community's position wrt individual CAs once trust is broken is much
more like someone who has to clean up after the debtor disappears (by
investigating, discussing, then drawing conclusions, then
distrusting-for-the-future and making sure both end users and site
operators aren't unduly inconvenienced by the distrust of the
"defaulting" CA). That's effectively the position of a guarantor - who
can also refuse to act as a guarantor when a loan is set up, i.e. has a
yes/no choice, but otherwise can't influence the agreement between
lender and borrower itself.



> I also disagree with the contention that Mozilla has "effectively no
> recourse" should a trust "debtor" (CA) "default" (fail to make "payments"
> on the borrowed trust through providing services to certificate subscribers
> only in compliance with program and industry guidelines and with proper
> validations.)  Mozilla's recourse is essentially absolute: you can revoke
> the trust you've extended, preventing further damage.


As you say, we can "prevent **further** damage". But there is no
recourse for damage that has already happened. The potential for damage
cannot be limited pre-emptively, it is not possible to create incentives
for the debtor to repay (i.e. there is no way to hold security/collateral),
and there is no way to be compensated for damage once it has been
caused. Instead the trust store (here, Mozilla and the community in
mdsp) gets to clean up - and will refuse to sign up as guarantor for the
same entity unless things change drastically, because we're not *that*
stupid... And sure, that prevents future damage, but it does nothing to
remedy damage in the past, which is not normally how things work for
financial credit.


~ Gijs


Re: DarkMatter Concerns

2019-07-11 Thread Gijs Kruitbosch via dev-security-policy

On 11/07/2019 03:38, Matthew Hardeman wrote:

> I used
> the parallel to racism in finance because it's exceedingly well documented
> that strong objective systems of risk management and decisioning led to
> better overall financial outcomes AND significantly opened the door to
> credit (aka trust) to otherwise improperly maligned and underserved
> communities.


(for the avoidance of doubt: writing in a personal capacity - although I 
work for Mozilla I have nothing to do with this decision.)


Financial credit really isn't "aka trust".

The "strong objective system of risk management and decisioning"
includes the ability to manage risk (e.g. by determining the amount of
credit and the interest rate, by requiring a guarantor or security
(collateral), by requiring certain types of insurance so the creditor
doesn't lose out if the debtor dies, ...), and there's no way for a
trust store to "risk manage" a CA in any of those ways. Mozilla can't
limit issuance to a certain number of certificates, or a certain set of
domains, or set financial penalties for misissuance, or ...


Additionally, once an agreement is struck, the repayments on the credit
provide ongoing information about the debtor's current performance;
there is no equivalent of that in the CA world. And should repayments
stop, the lender normally has some means of recouping losses (whether
that's through the object which secured the loan, through the guarantor,
or through the court/bailiff system), and the only people affected are
the lender and the debtor (and guarantor, if any). None of that is true
for a trust store, either, where the people affected by a "default" are
the relying parties.


If we're going to make a comparison to finance, this is more akin to 
Mozilla being asked to sign up as guarantor for every CA, in a huge loan 
that's being extended by all the users of their trust store. Any 
financial adviser worth their salt will tell you never to be a guarantor 
for anybody unless you're very, very sure of that person, because you 
have effectively no recourse if the debtor leaves you holding the bag.


~ Gijs


Re: Possible DigiCert in-addr.arpa Mis-issuance

2019-03-02 Thread Gijs Kruitbosch via dev-security-policy

On 02/03/2019 08:45, Cynthia Revström wrote:

> On 2019-03-02 01:49, George Macon via dev-security-policy wrote:
>
>> One specific question on this point: Why did the software permit setting
>> the approval scope to a public suffix (as defined by inclusion on the
>> public suffix list)? Could validation agent action set the approval
>> scope to some other two-label public suffix like co.uk?
>
> I think this is highly unlikely seeing as this was a human error and
> unlike in-addr.arpa, people might know about .co.uk.


But the PSL is very large (by human, not machine, standards) and most 
humans will not be familiar with most/all of the entries on the list. 
Note for instance that (most/all of?) AWS is represented in one way or 
another, as are other hosting services that are much less well-known. It 
seems worth checking the PSL automatically, and it's curious that such 
checks were not present or did not prevent/discourage the agent from 
acting as they did.
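
As a purely illustrative sketch (not a claim about how the validation
software in question is built): with any PSL library such a check is
only a few lines. For example, in Python, assuming the third-party
tldextract package (which bundles, and can refresh, a copy of the PSL),
something like this could flag an approval scope that is itself a PSL
entry:

import tldextract

def is_public_suffix(name):
    """True if `name` is itself an entry on the Public Suffix List,
    i.e. there is no registrable-domain label left over."""
    ext = tldextract.extract(name)
    # "host.example.co.uk" -> domain="example", suffix="co.uk"
    # "co.uk"              -> domain="",        suffix="co.uk"
    return ext.suffix != "" and ext.domain == ""

# An approval workflow could refuse, or require extra sign-off for,
# any scope that trips this check:
for scope in ("co.uk", "example.co.uk"):
    print(scope, "->", "public suffix!" if is_public_suffix(scope) else "ok")

A real implementation would still have to decide how to treat the PSL's
"private" section and names *under* a public suffix, but flagging an
exact PSL match for extra human review seems like a very low bar.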


(Note that I'm not overly familiar with the BR and various other 
guidelines, and under what circumstances issuance to entries in the PSL 
is/isn't permitted, but intuitively it seems like a red flag once we're 
talking about manual (rather than automatically verified) issuance.)


~ Gijs


Re: Incident report - Misissuance of CISCO VPN server certificates by Microsec

2018-12-05 Thread Gijs Kruitbosch via dev-security-policy

On 05/12/2018 19:45, Wayne Thayer wrote:

> On Wed, Dec 5, 2018 at 1:58 PM dr. Sándor Szőke via dev-security-policy
> <dev-security-policy@lists.mozilla.org> wrote:
>
>> 6./
>> Explanation about how and why the mistakes were made or bugs introduced,
>> and how they avoided detection until now.
>>
>> Microsec manages the CISCO VPN server certificates separately from the TLS
>> certificates. The policy of the CISCO VPN servers was not changed when the
>> validity of the TLS certificates changed from 3 years to 2 years in March
>> 2018.
>
> Why wasn't the policy for Cisco VPN servers updated? This points to a
> deeper failure to properly manage all of the profiles used to issue
> certificates that chain to publicly-trusted roots, and I would like to
> better understand what went wrong and how it will be prevented in the
> future?


Adding some more questions on to this: does Microsec have any other
non-TLS cert policies that it "manages separately" from the TLS ones (no
matter how infrequently used)? If so, how many? And have you verified
whether any of the certificates issued under those policies might
qualify as TLS certs and thus fall under the BRs - and if so, that they
abide by the BR validity requirement?


~ Gijs


Re: Incident report D-TRUST: syntax error in one tls certificate

2018-11-26 Thread Gijs Kruitbosch via dev-security-policy

(for the avoidance of doubt: posting in a personal capacity)

On 23/11/2018 15:24, Enrico Entschew wrote:

> Timeline:
> 2018-11-12, 10:30 UTC Customer was contacted the first time. Customer runs an
> international critical trade platform for emissions. Immediate revocation of
> the certificate would cause irreparable harm to the public.

> 2018-11-22, 16:08 UTC The certificate with the serial number 3c 7c fb bf ea 35
> a8 96 c6 79 c6 5c 82 ec 40 13 was revoked by customer.


Some questions I have:

1) Don't the BRs specify that CAs MUST revoke within 24 hours (for some
issues) or 5 days (for others)? This looks like just over 10 days, and
the revocation appears to have been customer-prompted rather than
performed by the CA. Am I just missing the part of the BRs that says
ignoring the 5 days is OK if it's "just" a syntax error?


2) What procedure does D-TRUST follow to ensure adequate revocation
times, and in particular, under what circumstances does it decide that
not revoking until the customer gives an OK is necessary (e.g. how does
it decide what constitutes an "international[ly] critical" site)? Is
this documented, e.g. in the CPS or similar? Have auditors signed off on
that?


3) Can you elaborate on how the system being down would cause
"irreparable harm"? What would have happened if the cert had just been
revoked after 24/120 hours? In this case, the website in question
( www.dehst.de ) has been broken in Firefox for the past 64 or so hours
(i.e. since about 6pm UK time on Friday, when I first read your message)
because the server doesn't actually send the full chain of certs for its
new certificate. Given that the server (AFAICT) doesn't staple OCSP
responses, I don't imagine that practical breakage in a web browser
would have been worse if the original cert had been revoked immediately,
given that the CRL revocation done last week hasn't appeared in
CRLSet/OneCRL either.
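
(For anyone who wants to reproduce that last point: any strict TLS
client that does not go and fetch missing intermediates will show the
problem directly. A minimal sketch in Python, standard library only -
the hostname and the behaviour are just what I observed above and may of
course change:)

import socket
import ssl

HOST = "www.dehst.de"

ctx = ssl.create_default_context()  # verify against the platform root store
try:
    with socket.create_connection((HOST, 443), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
            print("Handshake OK, leaf subject:", tls.getpeercert()["subject"])
except ssl.SSLCertVerificationError as err:
    # A server that omits its intermediate CA certificate(s) typically fails
    # here with "unable to get local issuer certificate", because OpenSSL
    # (and therefore Python) does not go off and fetch intermediates the way
    # some browsers do.
    print("Verification failed:", err)

Firefox likewise doesn't fetch missing intermediates, which is why the
site breaks there even though browsers that do fetch them may still load
it.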


~ Gijs



Re: On the value of EV

2017-12-19 Thread Gijs Kruitbosch via dev-security-policy

On 18/12/2017 21:54, Andrew wrote:

> On Monday, December 18, 2017 at 3:09:31 PM UTC-6, Wayne Thayer wrote:
>
>> Thank you Ryan for raising this question, and to everyone who has been
>> contributing in a constructive manner to the discussion. A number of
>> excellent points have been raised on the effectiveness of EV in general and
>> on the practicality of solving the problems that exist with EV.
>>
>> While we have concerns about the value of EV as well as the potential for
>> EV to actually harm users, Mozilla currently has no definite plans to
>> remove the EV UI from Firefox. At the very least, we want to see
>> Certificate Transparency required for all certificates before making any
>> change that is likely to reduce the use of EV certificates.
>>
>> Is Google planning to remove the EV UI from desktop Chrome? If so, how does
>> that relate to the plan to mark HTTP sites as ‘Not secure’ [1]? Does this
>> imply the complete removal of HTTPS UI?
>>
>> While we agree that improvements to EV validation won’t remove many of the
>> underlying issues that have been raised here, we hope that CAs will move
>> quickly to make the EV Subject information displayed in the address bar
>> more reliable and less confusing.
>>
>> - Wayne
>>
>> [1]
>> https://security.googleblog.com/2016/09/moving-towards-more-secure-web.html
>
> So, given that Mozilla has no immediate plans to remove the EV UI from Firefox,
> perhaps the UI should be adjusted to include the state the Subject is
> registered in on the EV badge. No reason for that text to be any more
> misleading than necessary. (I assume this is something we can pretty much all
> agree on, yes?)


As people have already mentioned, states aren't necessarily that 
informative even within the US. Plus it opens up other phishing-y 
avenues, like registering a California company that matches some 
Canadian company's name. So it's not clear that would be an improvement, 
and certainly not a *strict* improvement -- even before factoring in 
screen real estate, development and testing effort required, etc.


~ Gijs


Re: On the value of EV

2017-12-13 Thread Gijs Kruitbosch via dev-security-policy

On 13/12/2017 14:50, Tim Shirley wrote:

> I guess I’m also having a hard time appreciating how the presence of this
> information is a “cost” to users who don’t care about it.  For one thing, it’s
> been there for years in all major browsers, so everyone has at least been
> conditioned to its presence already.  But how is someone who isn’t interested
> in the information in the first place being confused by it?  And if the mere
> presence of an organization name is creating confusion,

In addition to what Ryan said: speaking as an engineer who's worked on
the Firefox URL bar, I'd note that the EV indicator also has a
non-trivial cost in terms of implementation/UI-design complexity.


On a purely practical level, displaying a longer EV entity string means
less of the actual URL is visible to the user, which is itself a
phishing risk.



> then surely a URL with lots of words and funny characters in it would be
> confusing people too, and we should remove that too, right?


I know you're speaking in jest, but yes. This is exactly why Safari
doesn't show the URL path/querystring etc. in the URL bar when the URL
isn't being edited (only the domain and/or EV name). We may or may not
end up doing something similar (i.e. lose the path/querystring/hash) in
Firefox, but either way there are definitely reasonable arguments for
doing something along those lines.


Going further off-topic: as people have already implied, perhaps we want
other trust UI that provides users with more meaningful information
about the trust status of a page and that is easier to understand than a
URL or scheme/hostname/port combination. But we don't need to block
removing the EV UI on that if there's consensus that the EV UI doesn't
add (sufficient) value to remain in browsers.


~ Gijs


Re: dNSName containing '/' / low serial number entropy

2017-08-11 Thread Gijs Kruitbosch via dev-security-policy

On 11/08/2017 15:39, Policy Authority PKIoverheid wrote:

> 2. Why did DDY not implement the serial number entropy as required by the
> Baseline Requirements?
> 3. Was this detected by the auditor? If not, why not?
>
> ANSWER ON QUESTION 2:
> DDY concluded wrongly that ballot 164 was not applicable for them since the use
> of sequential serial numbers is not a security risk when used in conjunction
> with the SHA-256 with RSA encryption certificate signing scheme.
>
> ANSWER ON QUESTION 3:
> Non-compliance with this requirements wasn’t noticed by the auditor because DDY
> didn’t include the specific requirement in their Statement of Applicability
> (reason: see the answer on question 2). ETSI EN 319 403 (which determines the
> requirements for conformity assessment bodies) is not clear about who
> determines the scope of an audit. The auditor’s interpretation was that the
> client (DDY) had to determine the scope of the audit (based on their Statement
> of Applicability). This will be mitigated for future audits with new measure 4.


(apologies if this is a dumb question...)

Can Mozilla / the BRs / whatever enforce making this [i.e. who
determines the scope of the audits] explicit, so issues don't get missed
because the CA/TSP/subCA/intermediates and/or auditor mistakenly believe
some items don't apply? Could we standardize/require some of this
"Statement of Applicability" stuff to be a superset of the BRs,
applicable RFCs, etc.?


Or is that going to be useless either because whatever requirements on 
audits/auditors that Mozilla / the BRs would suggest get "trumped" by 
ETSI or other rules we can't (directly) influence, or because there are 
so many possible permutations of applicability/scope that trying to 
specify them in some way defeats the point, in that it would cause more 
rather than less confusion?


(just trying to figure out if there is some way we can avoid a
recurrence of this confusion with other issuers and/or auditors)


~ Gijs