Re: [FORGED] Firefox removes UI for site identity

2019-10-29 Thread James Burton via dev-security-policy
Hi Paul,

I take the view that the articles on the CA Security Council website are a
marketing gimmick with no value whatsoever.

Thank you

Burton

On Tue, Oct 29, 2019 at 5:55 PM Paul Walsh via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> Hi Nick,
>
> > On Oct 29, 2019, at 7:07 AM, Nick Lamb  wrote:
> >
> > On Mon, 28 Oct 2019 16:19:30 -0700
> > Paul Walsh via dev-security-policy
> >  wrote:
> >> If you believe the visual indicator has little or no value why did
> >> you add it?
> >
> > The EV indication dates back to the creation of Extended Validation,
> > and so the CA/Browser forum, which is well over a decade ago now.
> >
> > But it inherits its nature as a positive indicator from the SSL
> > padlock, which dates back to the mid-1990s when Netscape developed SSL.
> > At the time there was not yet a clear understanding that negative
> > indicators were the Right Thing™, and because Tim's toy hypermedia
> > system didn't have much security built in there was a lot of work to
> > do to get from there to here.
> >
> > Plenty of other bad ideas date back to the 1990s, such as PGP's "Web of
> > Trust". I doubt that Wayne can or should answer for bad ideas just
> > because he's now working on good ideas.
>
> [PW] I agree with your conclusion. But you’re commenting on the wrong
> thing. You snipped my message so much that my comment above is without
> context. You snipped it in a way that a reader will think I’m asking about
> the old visual indicators for identity - I’m not. I asked Wayne if he
> thinks the new Firefox visual indicator for tracking is unnecessary.
>
> I don’t want to labour my points any more. Those who disagree and took the
> time to comment aren’t willing to exchange meaningful, constructive,
> respectful counterarguments. Those who disagree but aren’t commenting may
> or may not care at all. And those who agree mostly show their support in
> private. I feel like this conversation is sucking up all the oxygen as a
> result.
>
> If we were all doing such a great job, attacks wouldn’t be on the rise and
> phishing wouldn’t be the number one problem. And we all know phishing is
> where a user falls for a deceptive website.
>
> One last time, here’s the article I wrote with many data points
> https://casecurity.org/2019/10/10/the-insecure-elephant-in-the-room/ <
> https://casecurity.org/2019/10/10/the-insecure-elephant-in-the-room/>
>
> I’m going to edit this article for Hackernoon, to include additional
> context about my support *for* encryption, HTTPS, the padlock and free DV certs.
> I support them all, obviously. But some people assume I don’t support these
> critical elements because I pointed out the negative impact that their
> implementation is having.
>
> Thanks,
> - Paul
>
> >
> > Nick.
>


Re: [FORGED] Firefox removes UI for site identity

2019-10-28 Thread James Burton via dev-security-policy
>
> [PW] Phil knows more about the intent so I’ll defer to his response at the
> end of this thread. I would like to add that computer screens bigger than
> mobile devices aren’t going away. So focusing only on mobile isn’t a good
> idea.
>
> Thanks for the constructive conversation James, finally :) But I don’t
> necessarily agree with your assertion about there being a lack of room to
> support identity. It all comes down to priority as you know. We could have
> said that Firefox mobile didn’t have enough room for tracking
> icons/settings before it was implemented - but because Mozilla feels this
> is important, they made the room. They made assertions about the lack of
> real estate for identity prior to implementing visual indicators for
> tracking.
>
> Mozilla once asserted that it wouldn’t implement any filtering
> tools/preferences for any reason because it was considered “censorship”.
> They have clearly changed their position - thankfully, with the filters for
> trackers/ads.
>
> Mozilla dropped its mobile browser strategy completely for a long period
> of time, but the team is now focused on mobile again. So things do change
> with time and realization of market conditions and mistakes. Everyone makes
> mistakes.
>
>
>> It's right that we are removing the extended validation style visual
>> security indicator from browsers because of a) the above statement b)
>>
>
> One could argue that there’s less room inside an app WebView - where
> there's so much inconsistency it hurts my head. Here’s an example of a
> design implementation that *might* work to help demonstrate my point about
> there being enough room - it’s not ideal but I only spent 5 minutes on it.
> [1]
>

I took a look at your concept for an extended validation type visual
security indicator and my conclusion is that it doesn't provide any
assurance to users that the website is vetted or trustworthy. In that
respect it is similar to the padlock visual security indicator, which
likewise provides no such assurance. The padlock visual security
indicator only provides the user with a visual indication that the
connection is encrypted.

Read Emily Stark's Twitter response regarding Chrome and the removal of the
padlock visual security indicator:
https://twitter.com/estark37/status/1183769863841386496?s=20


> normal users don't understand extended validation style visual security
>> indicators c)
>>
>
> Because they were never educated properly - UX sucked more than anything.
> But you don’t just remove something without iterating to achieve
> product/market fit. That’s what happened with identity.
>

Users shouldn't have to go through education lessons to recognise different
positive visual security indicators. It's a stupid idea.

The next stupid idea will be expecting users to go through a compulsory exam
to learn about the different positive visual security indicators.
If they fail, they can't purchase goods online; if they pass, they get a
license issued to allow them to purchase goods online.

Browsers iterating positive visual security indicators to achieve
product/market fit is another stupid idea. It's good for CAs' profit
margins. It's bad for users, as it will totally confuse them. Even if we did
go down this stupid path, how many times would browsers need to change the
visual security indicators to suit the CAs' products?


> the inconsistencies of extended validation style visual security indicator
>> between browsers d) users can't tell who is real or not based on extended
>> validation style visual security indicators as company names sometimes
>> don't match the actual site name.
>>
>
> I agree. This is why they should have been improved instead of removed.
> Mozilla will likely iterate the UI/UX around tracking to improve adoption.
>

Above. Stupid idea.


>
> Ian, like every other commentator I’ve read on this subject, says things
> that I agree with. But their conclusions and proposals are completely
> flawed in my opinion. As I’ve said before, you don’t just remove something
> that doesn’t see major adoption - you iterate/test. You’d only remove UI if
> you know for sure that it can’t be improved - there’s no data to suggest
> that any research was done around this. Mozilla have only supplied links to
> research that’s flawed and so old it’s useless. I’m blown away by their
> referencing research from more than 10 years ago. Some amazing people on
> this list weren’t even working with web tech back then.
>

Extended validation isn't a new concept, and it has been proven to have
failed.


>
>
>
>> [1]  https://www.typewritten.net/writer/ev-phishing
>> [2]  https://stripe.ian.sh
>>
>
>
>
> [PW] [1] https://imgur.com/Va4heuo
>
> - Paul
>
>
>
>
> The original proposal that led to EV was actually to validate the company
> logos and present them as logotype.
> There was a ballot proposed here to bar any attempt to even experiment
> with logotype. This was withdrawn after I pointed out to Mozilla 

Re: [FORGED] Re: Firefox removes UI for site identity

2019-10-25 Thread James Burton via dev-security-policy
Extended validation was introduced at a time when most people browsed the
internet on low/medium resolution large-screen devices that provided the
room for an extended validation style visual security indicator.
Everything has moved on and purchases are made on small-screen devices that
have no room to support an extended validation style visual security
indicator. Apple supported an extended validation style visual security
indicator in the iOS browser and it failed [1] [2].

It's right that we are removing the extended validation style visual
security indicator from browsers because: a) of the above; b) normal users
don't understand extended validation style visual security indicators; c)
extended validation style visual security indicators are inconsistent
between browsers; and d) users can't tell who is real or not based on
extended validation style visual security indicators, as company names
sometimes don't match the actual site name.

[1]  https://www.typewritten.net/writer/ev-phishing
[2]  https://stripe.ian.sh

Thank you

Burton

On Fri, Oct 25, 2019 at 5:35 AM Phillip Hallam-Baker via
dev-security-policy  wrote:

> On Thu, Oct 24, 2019 at 9:54 PM Peter Gutmann via dev-security-policy <
> dev-security-policy@lists.mozilla.org> wrote:
>
> > Paul Walsh via dev-security-policy <
> dev-security-policy@lists.mozilla.org>
> > writes:
> >
> > >we conducted the same research with 85,000 active users over a period of
> > >12 months
> >
> > As I've already pointed out weeks ago when you first raised this, your
> > marketing department conducted a survey of EV marketing effectiveness.
> If
> > you have a refereed, peer-reviewed study published at a conference or in
> > an academic journal, please reference it, not a marketing survey
> > masquerading as a "study".
>
>
> There are certainly problems with doing usability research. But right now
> there is very little funding for academic studies that are worth reading.
>
> You didn't criticize the paper with 27 subjects split into three groups
> from 2007. Nor did you criticize the fact that the conclusions were totally
> misrepresented.
>
> So it doesn't appear to be spurious research that you have a problem with
> or the misrepresentation of the results. What you seem to have a problem
> with is the conclusions.
>
> At least with 85,000 subjects there is some chance that Paul himself has
> found out something of interest. That doesn't mean that we have to accept
> his conclusions as correct or incontrovertible, but I think it does mean
> that he deserves to be treated with respect.
>
> I am not at all happy with the way this discussion has gone. It seems that,
> contrary to the claims of openness, Mozilla has a groupthink problem. For
> some reason it is entirely acceptable to attack CAs for any reason and with
> the flimsiest of evidence.


Re: Intent to Ship: Move Extended Validation Information out of the URL bar

2019-08-30 Thread James Burton via dev-security-policy
Kirk,

I know you are really passionate about extended validation and it does
come across in your correspondence on this forum and in the CAB Forum,
but sometimes our passion or frustration leads us to divulge private
information which shouldn't be released into the public domain.
Before you post any other correspondence to this forum or any other
forum, give it a check over or leave it until you are in a better
frame of mind. You don't want to accidentally breach the NDA agreements
you have signed over the course of your work.

Thank you,

Burton

On Fri, Aug 30, 2019 at 8:45 PM Kirk Hall via dev-security-policy
 wrote:
>
> On Friday, August 30, 2019 at 11:38:55 AM UTC-7, Peter Bowen wrote:
> > On Fri, Aug 30, 2019 at 10:22 AM Kirk Hall via dev-security-policy <
> > dev-security-policy@lists.mozilla.org> wrote:
> >
> > > I'll just reiterate my point and then drop the subject.  EV certificate
> > > subject information is used by anti-phishing services and browser phishing
> > > filters, and it would be a loss to the security ecosystem if this EV data
> > > disappears (meaning that the decision on removal of the EV UI has greater
> > > repercussions than just whether or not users can tell in the primary UI if
> > > their website does or does not have any confirmed identity information).
> > >
> >
> > Kirk,
> >
> > I have to admit that the first time I ever heard of browser phishing
> > filters and Internet security products (such as Trend Micro, Norton,
> > Mcafee, etc) differentiating between DV and EV SSL certificates as part of
> > their algorithm is in this thread, from you.  As someone who has a website,
> > I would really appreciate it if you could point to where this is
> > documented.  This morning I looked at a couple of network security vendor
> > products I've used and couldn't find any indication they differentiate, but
> > if there are ones that do it would certainly influence my personal decision
> > on the kind of certificates to use and to recommend others to use.
> >
> > I'm not personally aware of anyone doing this.  Are you aware of any
> > product literature that discusses this?
> >
> > Thanks,
> > Peter
>
> I have some emails out asking permission to share information that was given 
> to us in the past.  If I receive permission, I will post something further.


Re: Intent to Ship: Move Extended Validation Information out of the URL bar

2019-08-29 Thread James Burton via dev-security-policy
These so called "extended" validation vetting checks on companies for
extended validation certificates are supposed to provide the consumer
on the website with an high level of assurance that the company has
been properly validated but the fact is that these so called
"extended" validation vetting checks are nothing more than basic
checks. The Disclosure and Barring Service (DBS) in the United Kingdom
conducts more vetting checks on an individual applying for an basic
DBS check than CAs do for an so called "extended" validation
certificates for companies.

I have serious doubts about the methods used by CAs to conduct these
so-called "extended" validation vetting checks. This is from personal
experience of going through dozens upon dozens of validation checks for
all types of certificates with different CAs.

These so called "extended" validation certificates should be removed
forthwith because it is not performing the intended job it was
supposed to be made for and given that these so called "extended"
validation certificates are nothing more than basic checks it is in a
way falsely advertising to consumers on these websites that uses these
so called "extended" validation certificates that they have been
validated to an "extended" level of vetting which they have not.

Burton

On Thu, Aug 29, 2019 at 8:17 PM Ryan Sleevi via dev-security-policy
 wrote:
>
> On Thu, Aug 29, 2019 at 2:49 PM Kirk Hall via dev-security-policy <
> dev-security-policy@lists.mozilla.org> wrote:
>
> > Sure, I’m happy to explain, using Bank of America as an example.
>
>
> Kirk,
>
> Thanks for providing this example. Could you help me understand how it
> helps determine that things are safe? For example, the reputation system
> you described, which is more akin to code signing than what is generally
> practiced in anti-phishing, seems like if it was implemented, it would
> leave users at significant risk from compromise on EV sites. That is, if an
> EV-using site was compromised and displayed a phishing form, the fact that
> it had "good" reputation would actually be actively harmful to users
> security, because it would make it harder to provide timely responsiveness.
> That is, it would be a false negative.
>
> In this case, the use of EV certificates, and the presumption of
> reputation, would lead to actively worse security.
>
> Did I misunderstand the scenario?


Re: Intent to Ship: Move Extended Validation Information out of the URL bar

2019-08-27 Thread James Burton via dev-security-policy
Resend again to fix spelling errors and add extra details

The correct way to vet a UK company would be:
1. The CA checks Companies House to confirm the company is incorporated.
2. The CA sends a letter with a verification code to the company address
listed on Companies House.
3. The CA requests the company to send them a bank statement / tax bill to
prove operation.
4. Once the company has sent the information described in (3) and it is
confirmed by the CA, proceed to (5); otherwise repeat (3).
5. Once the company receives the letter with the verification code, proceed
to (6); otherwise vetting is on hold until the company receives the letter
with the verification code.
6. The CA initiates a video call with the director who is listed at Companies
House and then the CA does the following:
a) The CA asks the director to hold up his/her passport to the side of
their face to confirm that the face matches the passport photo, and then the
CA confirms that details such as full name and DOB match the information
printed on the passport.
(optionally: For high profile companies the CA could also ask for the
passport to be countersigned and put the video call on hold while they
confirm the countersignature with the individual.)
(optionally: For high profile companies the CA could also ask for the
incorporation certificate to be countersigned and put the video call on
hold while they confirm the countersignature with the individual.)
b) The CA asks the director to hold up the verification letter in front of
the camera to confirm company address.
c) The CA calls the company number listed on a 3rd-party register (an
automated phone call like the Google verification phone call) and the
director reads the verification code back to the CA to confirm the phone
number belongs to the company.
d) The director signs the agreement.
7. That's it.


This method confirms that:
a) The company is in active operation at the address listed on Companies
House.
b) The company phone number is in active operation at the company.
c) The director of the company has been vetted and has confirmed the
certificate request by signing the agreement.
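
Purely as an illustration of the shape of this workflow (every step is a
manual check in practice, and none of the names below reflect any CA's real
tooling), the proposal amounts to an ordered series of checks that must all
pass before issuance:

    import java.util.LinkedHashMap;
    import java.util.Map;
    import java.util.function.BooleanSupplier;

    public class UkCompanyVetting {
        // Each supplier stands in for one of the manual steps listed above.
        static boolean vet(Map<String, BooleanSupplier> steps) {
            for (Map.Entry<String, BooleanSupplier> step : steps.entrySet()) {
                if (!step.getValue().getAsBoolean()) {
                    System.out.println("On hold / declined at: " + step.getKey());
                    return false; // vetting stops at the first failed step
                }
            }
            return true; // company, address, phone number and director all confirmed
        }

        public static void main(String[] args) {
            Map<String, BooleanSupplier> steps = new LinkedHashMap<>();
            steps.put("1. company incorporated at Companies House", () -> true);
            steps.put("2-5. mailed code received, bank statement/tax bill confirmed", () -> true);
            steps.put("6a-6c. passport, letter and phone code confirmed on video call", () -> true);
            steps.put("6d. agreement signed by the director", () -> true);
            System.out.println(vet(steps) ? "issue certificate" : "do not issue");
        }
    }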

Burton

On Tue, Aug 27, 2019 at 10:44 AM James Burton  wrote:

> Companies House (
> http://resources.companieshouse.gov.uk/serviceInformation.shtml#compInfo)
> says "We carry out basic checks on documents received to make sure that
> they have been fully completed and signed, but we do not have the statutory
> power or capability to verify the accuracy of the information that
> companies send to us. The fact that the information has been placed on the
> public record should not be taken to indicate that Companies House has
> verified or validated it in any way."
>
> Dun and Bradstreet takes a copy of this information without any
> verification checking and can be easily updated here
> https://www.dnb.co.uk/utility-pages/data-update.html again without any
> verification checks.
>
> When a CA is vetting an organization for an EV certificate, what
> information is actually vetted? Does the organization operate at that
> address? Am I actually speaking to the director James Burton in the
> verification call?
>
> The correct way to vet a UK company would be to:
> 1. The CA checks Companies House to check if the company is incorporated.
> 2. The CA sends a letter with verification code to the company address
> listed on Companies House.
> 3. The CA requests the company to send them a bank statement / tax bill to
> prove operation.
> 4. Once the company has sent the information provided in (3) and is it
> confirmed by CA then (5) otherwise (3).
> 5. Once the company receives the letter with verification code then (6)
> otherwise vetting on hold until company receives letter with verification
> code.
> 6. The CA initiates video call with the director at the listed at
> Companies House and then the CA does the following:
> a) Asks the director to hold up his/her passport to the side of their face
> to confirm that the face matches the passport photo and then the CA
> confirms the details such full name and DOB matches the information printed
> on the passport.
> b) The CA asks the director to hold up the verification letter in front of
> the camera to confirm company address.
> c) The CA then calls the company number listed on 3rd party register
> (automated phone call like Google verification phone call) and the director
> tell that verification code to the CA to confirm the phone number is
> belongs to the company.
> d) Signs the agreement.
> 7. That's it.
>
> On Tue, Aug 27, 2019 at 9:21 AM Jakob Bohm via dev-security-policy <
> dev-security-policy@lists.mozilla.org> wrote:
>
>> On 27/08/2019 08:03, Peter Gutmann wrote:
>> > Jakob Bohm via dev-security-policy <
>> dev-security-policy@lists.mozilla.org> writes:
>> >
>> >>  and
>> >>  both took advantage of weaknesses in two
>> >> government registries
>> >
>> > They weren't "weaknesses in government registries", they were registries
>> > working as designed, 

Re: Intent to Ship: Move Extended Validation Information out of the URL bar

2019-08-27 Thread James Burton via dev-security-policy
Companies House (
http://resources.companieshouse.gov.uk/serviceInformation.shtml#compInfo)
says "We carry out basic checks on documents received to make sure that
they have been fully completed and signed, but we do not have the statutory
power or capability to verify the accuracy of the information that
companies send to us. The fact that the information has been placed on the
public record should not be taken to indicate that Companies House has
verified or validated it in any way."

> Dun and Bradstreet takes a copy of this information without any
> verification checking, and it can be easily updated here
> https://www.dnb.co.uk/utility-pages/data-update.html again without any
> verification checks.

When a CA is vetting an organization for an EV certificate, what
information is actually vetted? Does the organization operate at that
address? Am I actually speaking to the director James Burton in the
verification call?

The correct way to vet a UK company would be to:
1. The CA checks Companies House to check if the company is incorporated.
2. The CA sends a letter with verification code to the company address
listed on Companies House.
3. The CA requests the company to send them a bank statement / tax bill to
prove operation.
4. Once the company has sent the information described in (3) and it is
confirmed by the CA, proceed to (5); otherwise repeat (3).
5. Once the company receives the letter with the verification code, proceed
to (6); otherwise vetting is on hold until the company receives the letter
with the verification code.
6. The CA initiates a video call with the director listed at Companies
House and then the CA does the following:
a) Asks the director to hold up his/her passport to the side of their face
to confirm that the face matches the passport photo, and then the CA
confirms that details such as full name and DOB match the information
printed on the passport.
b) The CA asks the director to hold up the verification letter in front of
the camera to confirm company address.
c) The CA then calls the company number listed on a 3rd-party register
(an automated phone call like the Google verification phone call) and the
director reads the verification code back to the CA to confirm the phone
number belongs to the company.
d) Signs the agreement.
7. That's it.

On Tue, Aug 27, 2019 at 9:21 AM Jakob Bohm via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> On 27/08/2019 08:03, Peter Gutmann wrote:
> > Jakob Bohm via dev-security-policy <
> dev-security-policy@lists.mozilla.org> writes:
> >
> >>  and
> >>  both took advantage of weaknesses in two
> >> government registries
> >
> > They weren't "weaknesses in government registries", they were registries
> > working as designed, and as intended.  The fact that they don't work in
> > they way EV wishes they did is a flaw in EV, not a problem with the
> > registries.
> >
>
> "Working as designed" doesn't mean "working as it should".
>
> The confusion that could be created online by getting EV certificates
> matching those company registrations was almost the same as that which
> could be created in the offline world by the registrations directly.
>
>
> >> Both demonstrations caused the researchers real name and identity to
> become
> >> part of the CA record, which was hand waved away by claiming that could
> >> have been avoided by criminal means.
> >
> > It wasn't "wished away", it's avoided without too much trouble by
> criminals,
> > see my earlier screenshot of just one of numerous black-market sites
> where
> > you can buy fraudulent EV certs from registered companies.  Again, EV may
> > wish this wasn't the case, but that's not how the real world works.
> >
>
> The screenshots you showed were for code signing EV certificates, not
> TLS EV certificates.  They seem related to a report a few years ago that
> spurred work to check the veracity of those screenshots and create
> appropriate countermeasures.
>
> >> 12 years old study involving en equally outdated browser.
> >
> > So you've published a more recent peer-reviewed academic study that
> > refutes the earlier work?  Could you send us the reference?
> >
>
> These two studies are outdated because they study the effects in a
> different overall situation (they were both made when the TLS EV concept
> had not yet been globally deployed).  They are thus based on entirely
> different facts (measured and unmeasured) than the situation in 2019.
>
> Very early in this thread someone quoted from a very recent study
> published at USENIX, comparing the prevalence of malicious sites with
> different types of certificates.  The only response was platitudes,
> such as emphasizing that a small number is nonzero.
>
> Someone is trying very hard to create a fait accompli without going
> through proper debate and voting in relevant organizations such as
> the CAB/F.  So when challenged they play very dirty, using every
> rhetorical trick they can find to overpower criticism of 

Re: Intent to Ship: Move Extended Validation Information out of the URL bar

2019-08-26 Thread James Burton via dev-security-policy
Jakob,

Before I touch on your comments, I wanted to point out that I was fairly
well known in the CA industry even back then, and that fact might have
tainted the results slightly because I am treated somewhat differently from
other customers: the validation staff look more carefully at the information
presented in my orders. If an order came from an anonymous party, the chance
of its success would be very high because the validation staff would process
it as normal, without the higher-level intervention which mostly happens
with my orders.

The reasons why I chose Symantec:
a) They were the largest and most popular CA at the time, and most
of the largest companies in the world used that CA.
b) Symantec also provided a 30-day risk-free trial of their EV SSL, and my
thinking at the time was that criminals would take advantage of that fact.

I did try Comodo the first time and it did fail; the second time I tried
Comodo it worked. I published it here:
https://www.typewritten.net/writer/ev-phishing-final/

Thank you

Burton

On Mon, Aug 26, 2019 at 8:00 PM Jakob Bohm via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> On 24/08/2019 05:55, Tom Ritter wrote:
> > On Fri, 23 Aug 2019 at 22:53, Daniel Marschall via dev-security-policy
> >  wrote:
> >>
> >> Am Freitag, 23. August 2019 00:50:35 UTC+2 schrieb Ronald Crane:
> >>> On 8/22/2019 1:43 PM, kirkhalloregon--- via dev-security-policy wrote:
> >>>
> >>> Whatever the merits of EV (and perhaps there are some -- I'm not
> >>> convinced either way) this data is negligible evidence of them. A DV
> >>> cert is sufficient for phishing, so there's no reason for a phisher to
> >>> obtain an EV cert, hence very few phishing sites use them, hence EV
> >>> sites are (at present) mostly not phishing sites.
> >>
> >> Can you proove that your assumption "very few phishing sites use EV
> (only) because DV is sufficient" is correct?
> >
> > As before, the first email in the thread references the studies
> performed.
>
> The (obviously outdated) studies quoted below were NOT referenced by the
> first message in this thread.  The first message only referenced two
> highly unpersuasive demonstrations of the mischief possible in
> controlled experiments.
>
>  and
>  both took advantage of weaknesses in two
> government registries to create actual dummy companies with misleading
> names, then trying to get EV certs for those (with mixed success, as at
> least some CAs rejected or revoked the certs despite the government
> failures).  At least the first of those demonstrations involved a no
> longer trusted CA (Symantec).  Both demonstrations caused the
> researchers real name and identity to become part of the CA record,
> which was hand waved away by claiming that could have been avoided by
> criminal means.
>
>
> Studies quoted by Tom Ritter on 24/08/2019:
>
> >
> > "By dividing these users into three groups, our controlled study
> > measured both the effect of extended validation certificates that
> > appear only at legitimate sites and the effect of reading a help file
> > about security features in Internet Explorer 7. Across all groups, we
> > found that picture-in-picture attacks showing a fake browser window
> > were as effective as the best other phishing technique, the homograph
> > attack. Extended validation did not help users identify either
> > attack."
> >
> > https://www.adambarth.com/papers/2007/jackson-simon-tan-barth.pdf
> >
>
> A 12-year-old study involving an equally outdated browser.
>
> > "Our results showed that the identity indicators used in the
> > unmodified FF3 browser did not influence decision-making for the
> > participants in our study in terms of user trust in a web site. These
> > new identity indicators were ineffective because none of the
> > participants even noticed their existence."
> >
> >
> http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.543.2117&rep=rep1&type=pdf
> >
>
> An undated(!) study involving highly outdated browsers.  No indication
> this was ever published in a peer-reviewed journal.
>
> > DV is sufficient. Why pay for something you don't need?
> >
>
> Unproven claim, especially by studies from before free DV without
> traceable credit card payments became the norm.
>
>
> Enjoy
>
> Jakob
> --
> Jakob Bohm, CIO, Partner, WiseMo A/S.  https://www.wisemo.com
> Transformervej 29, 2860 Søborg, Denmark.  Direct +45 31 13 16 10
> This public discussion message is non-binding and may contain errors.
> WiseMo - Remote Service Management for PCs, Phones and Embedded


Re: Intent to Ship: Move Extended Validation Information out of the URL bar

2019-08-16 Thread James Burton via dev-security-policy
If one compares the first EV specification with the current EV
specification, one will notice that the EV specification hasn't changed that
much during its lifetime. The issues surfaced in recent years through
research have been known about since the first adoption of the EV
specification. If CAs really cared about EV they would have tried to
improve it during the past 10+ years, but nothing happened. If browsers
decided to keep EV, what would change? Nothing at all.

There is no point in discussing the removal of EV any further because
the EV specification has already died.

On Fri, Aug 16, 2019 at 11:19 PM Matthew Hardeman via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> Honestly the issues, as I see them, are twofold:
>
> 1.  When I visit a site for the first time, how do I know I should expect
> an EV certificate?  I am conscientious about subsequent visits, especially
> financial industry sites.
>
> 2.  The browsers seem to have a bias toward the average user, that user
> literally being less ...smart/aware... than half of all of users.  EV is a
> feature that can only benefit people who are vigilant and know what to look
> for.  It seems dismissive of the more capable users, but I suppose that's
> their call.
>
> On Fri, Aug 16, 2019 at 5:15 PM Daniel Marschall via dev-security-policy <
> dev-security-policy@lists.mozilla.org> wrote:
>
> > I have a few more comments/annotations:
> >
> > (1) Pro EV persons argue "Criminals have problems getting an EV
> > certificate, so most of them are using only DV certificates".
> >
> > Anti EV persons argue "Criminals just don't use EV certificates, because
> > they know that end users don't look at the EV indicator anyway".
> >
> > I assume we do not know which of these two assumptions fits the
> > majority of criminals. So why should we make a decision (change of UI)
> > based on such assumptions?
> >
> > (2) I am a pro EV person, and I do not have any financial benefit from EV
> > certificates. I do not own EV certificates, instead my own websites use
> > Let's Encrypt DV certificates. But when I visit important pages like
> > Google or PayPal, I do look at the EV indicator bar, because I know that
> > these pages always have an EV certificate. If I visited PayPal and only
> > saw a normal padlock (DV), then I would instantly leave the page because
> > I know that PayPal always has an EV certificate. So, at least for me, the
> > UI change is very negative (except if you color the padlock in a different
> > color, that would be OK for me). We cannot say that all users don't care
> > about the EV indicator. For some users like me, it is important.
> >
> > (3) Also, I wanted to ask: if you want to remove the UI indicator
> > because you think that EV certificates give a feeling of false security,
> > then please tell me what the alternative is. Removing the UI bling
> > without giving any alternative solution is just wrong in my opinion. Yes,
> > there might be a tiny number of phishing sites that use EV certificates,
> > but the EV indicator bar is still better than just nothing. Anti-phishing
> > filters are not a good alternative because they only protect when the
> > harm is already done to some users.


Re: Fwd: Intent to Ship: Move Extended Validation Information out of the URL bar

2019-08-15 Thread James Burton via dev-security-policy
My understanding of the days before EV was that the CAs themselves made up
the validation requirements for DV, and because of this there were uneven
validation requirements across the industry. EV was the first document
created to solve this and standardise validation requirements for a
certificate type. Since then the Baseline Requirements have standardised
validation requirements for the DV certificate type, and therefore EV is no
longer needed.

Regarding the phishing aspect of EV, users have no clue what EV is and they
are more interested in looking for the padlock and completing the
checkout process.

On Thu, Aug 15, 2019 at 8:16 PM Ronald Crane via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> On 8/15/2019 10:58 AM, Doug Beattie via dev-security-policy wrote:
> > So far I see a number of contrived test cases picking apart small
> > components of EV, and no real data to back it up.
> I also would like to see more evidence of problems. However, I have to
> object to the idea that
> > Mostly academic...research, imho...
> is of little value. This treads dangerously close to nihilism.
> > https://stripe.ian.sh/: EV certificates with colliding entity names can
> > be generated, but to date, I don’t know of any real attacks, just this
> > academic exercise. And how much did it cost and how long did it take Ian
> > to get certificates to perform this experiment?  Way more time and money
> > than a phisher would invest.
> I question that a phisher, who stands potentially to gain hundreds of
> thousands or millions of dollars by phishing, e.g., the customers of a
> major bank, would not, as this paper says, invest "48 hours from
> incorporation to the issuance of the certificate" and "$177". This is a
> trivial investment for a non-frivolous financial phisher, let alone,
> say, a foreign government interested in phishing, say, a
> voter-registration (or -- shudder! -- an e-voting) site.
> > Yes, I work for a CA that issues EV certificates, but if there was no
> value in them, then our customers would certainly not be paying extra for
> them.
> That your customers may perceive additional value in them doesn't mean
> that they provide additional value to the general internet user. That
> said, I lean toward Mozilla letting this debate settle out before hiding
> EV support in release Firefox.
>
> -R
>


Re: Open Source CA Software

2019-03-14 Thread James Burton via dev-security-policy
(Forgot to post it to m.d.s.p)

You're right that we all failed to conduct the proper due diligence source
code checks on EJBCA and therefore missed this important issue. We all need
to learn from this past mistake and implement better checks which prevent
issues like this from arising in the future.

Thank you,

Burton

On Thu, Mar 14, 2019 at 10:57 PM Ryan Sleevi  wrote:

>
>
> On Thu, Mar 14, 2019 at 6:54 PM James Burton via dev-security-policy <
> dev-security-policy@lists.mozilla.org> wrote:
>
>> Let's Encrypt CA software 'Boulder' is open source for everyone to browse
>> and check for issues. All other CAs should follow the Let's Encrypt lead
>> and open source their own CA software for everyone to browse and check for
>> issues. We might have found the serial number issue sooner.
>>
>
> Considering EJBCA is open-source, this does not seem that it would
> logically follow.
>


Open Source CA Software

2019-03-14 Thread James Burton via dev-security-policy
Let's Encrypt's CA software, 'Boulder', is open source for everyone to browse
and check for issues. All other CAs should follow Let's Encrypt's lead
and open source their own CA software for everyone to browse and check for
issues. We might have found the serial number issue sooner.

Thank you,

Burton


Re: Initial Incident Report: Issuance of certificates with 63 bit serial number

2019-03-10 Thread James Burton via dev-security-policy
Hi Fotis,

You need to file this as a bugzilla bug.

Thank you,

Burton

On Sun, 10 Mar 2019 at 18:34, Fotis Loukos via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> SSL.com has been following the recent discussions at
> mozilla.dev.security.policy regarding the behavior of EJBCA based CAs in
> the matter of serial number generation.
>
> SSL.com is using EJBCA internally and is affected by the same issue.
> After consulting with our auditors, we would like to post a preliminary
> report of our findings.
>
> ### How your CA first became aware of the problem (e.g. via a problem
> report submitted to your Problem Reporting Mechanism, a discussion in
> mozilla.dev.security.policy, a Bugzilla bug, or internal self-audit),
> and the time and date.
>
> SSL.com has been following discussion of this issue in
> mozilla.dev.security.policy and initiated a review on 05/03/2019.
>
> ### A timeline of the actions your CA took in response. A timeline is a
> date-and-time-stamped sequence of all relevant events. This may include
> events before the incident was reported, such as when a particular
> requirement became applicable, or a document changed, or a bug was
> introduced, or an audit was done.
>
> - 10/07/2016 - Ballot 164 on Certificate Serial Number Entropy is voted.
> - 30/09/2016 - Ballot 164 enters into effect.
> - 05/03/2019 - Initial review initiated.
> - 05/03/2019 - Confirmed issue exists in SSL.com certificates.
> - 05/03/2019 - Tested and deployed correction to production systems.
> - 06/03/2019 - A full plan for the revocation of all certificates has
> been initiated. We plan on revoking all certificates within 30 days.
>
> ### Whether your CA has stopped, or has not yet stopped, issuing
> certificates with the problem. A statement that you have will be
> considered a pledge to the community; a statement that you have not
> requires an explanation.
>
> A patch was deployed as soon as we confirmed that the issue exists in
> SSL.com certificates. Certificate issuance has been resumed with serials
> meeting all requirements.
>
> ### A summary of the problematic certificates. For each problem: number
> of certs, and the date the first and last certs with that problem were
> issued.
>
> - 6931 End Entity TLS Certificates
>
> We will post an update that will include all S/MIME and CA certificate
> information.
>
> ### The complete certificate data for the problematic certificates. The
> recommended way to provide this is to ensure each certificate is logged
> to CT and then list the fingerprints or crt.sh IDs, either in the report
> or as an attached spreadsheet, with one list per distinct problem.
>
> Please find attached the file tlseelist.txt containing the fingerprints
> of all affected end entity TLS certificates. S/MIME and CA certificate
> information will be posted during an update.
>
> ### Explanation about how and why the mistakes were made or bugs
> introduced, and how they avoided detection until now.
>
> EJBCA's method of generating serial numbers has led to a discrepancy
> between expected and actual behavior and output, such that any CA using
> EJBCA with the default settings will encounter this issue (and be
> therefore in violation of BR 7.1).
>
> ### List of steps your CA is taking to resolve the situation and ensure
> such issuance will not be repeated in the future, accompanied with a
> timeline of when your CA expects to accomplish these things.
>
> The number of bits of entropy used for the generation of serial numbers
> has been increased to 127.
>
> In addition to remediation related to this issue, we will initiate a
> review of other technical requirements of CA/B Forum and how they are
> implemented by EJBCA in order to ensure no more problematic practices
> are followed.
>
> SSL.com intends to exceed minimum technical requirements where ever
> possible to guard against similar issues in the future and ensure the
> highest possible level of security and compliance.
>
> Regards,
> Fotis
>
> --
> Fotis Loukos, PhD
> Director of Security Architecture
> SSL Corp
> e: fot...@ssl.com
> w: https://www.ssl.com


Re: EJBCA defaulting to 63 bit serial numbers

2019-03-09 Thread James Burton via dev-security-policy
What concerns me overall in this discussion is the fact that some CAs
thought it was completely acceptable to barely scrape through to meet the
most basic minimum of requirements. I hope these CAs have a better security
posture and are not operating at the minimum.

Thank you,

Burton

On Sat, Mar 9, 2019 at 8:24 PM Ryan Sleevi via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> On Sat, Mar 9, 2019 at 2:49 PM Dimitris Zacharopoulos 
> wrote:
>
> > The question I'm having trouble answering, and I would appreciate if this
> > was answered by the Mozilla CA Certificate Policy Module Owner, is
> >
> > "does Mozilla treat this finding as a violation of the current language
> of
> > section 7.1 of the CA/B Forum Baseline Requirements"?
> >
>
> I think for Mozilla, this is best answered by Kathleen, Wayne, the Mozilla
> CA Policy Peers, and which I am not.
>
> On behalf of Google and the Chrome Root Authority Program, and consistent
> with past discussion in the CA/Browser Forum regarding expectations [1], we
> do view this as a violation of the Baseline Requirements. As such, the
> providing of incident reports, and the engagement with public discussion of
> them, represents the most transparent and acceptable course of action.
>
> Historically, we have found that the concerns around incident reporting
> have been best addressed through a single, unified, and transparent
> engagement in the community. Much as ct-pol...@chromium.org has happily
> and
> intentionally supported collaboration from counterparts at Mozilla and
> Apple, Mozilla has historically graciously allowed  for the unified
> discussion on this mailing list, and the use of their bugtracker for the
> purpose of engaging publicly and transparently on incident reports that
> affect the Web PKI. Should Mozilla have a different interpretation of the
> Baseline Requirements’ expectations on this, we’d seek guidance as to
> whether or not the bug tracker and mailing list continue to represent the
> best place for discussion of this specific issue, although note that
> historically, this has been the case.
>
> This should make it clear that CAs which extracted 64 bits of entropy as an
> input to an algorithm that then set the sign bit to positive and
> potentially decreasing the entropy to 63 bits, as opposed to
> unconditionally guaranteeing that there was a positive integer with _at
> least_ 64 bits of entropy, are non-compliant with the BRs and program
> expectations, and should file incident reports and include such disclosures
> in their reporting by and assertions to auditors.
>
> [1]
> https://cabforum.org/pipermail/public/2016-April/007245.html
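
To make the arithmetic concrete, here is a minimal Java sketch of the
difference being described (illustrative only: this is not EJBCA's or any
CA's actual code, and the 127-bit figure simply mirrors SSL.com's incident
report above):

    import java.math.BigInteger;
    import java.security.SecureRandom;

    public class SerialEntropy {
        private static final SecureRandom RNG = new SecureRandom();

        // The pitfall: draw 64 random bits, then force the value positive.
        // The sign bit is no longer free to vary, so the output carries only
        // about 63 bits of entropy -- short of BR 7.1's "at least 64 bits".
        static BigInteger weakSerial() {
            long raw = RNG.nextLong();            // 64 random bits
            return BigInteger.valueOf(raw).abs(); // sign bit fixed -> ~63 bits
        }

        // One compliant approach: request a non-negative integer with well
        // over 64 bits of entropy up front, so nothing is stripped afterwards.
        static BigInteger compliantSerial() {
            BigInteger serial;
            do {
                serial = new BigInteger(127, RNG); // uniform in [0, 2^127 - 1]
            } while (serial.signum() == 0);        // serials must be positive
            return serial;
        }

        public static void main(String[] args) {
            System.out.println("weak:      " + weakSerial());
            System.out.println("compliant: " + compliantSerial());
        }
    }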


Re: A modest proposal for a better BR 7.1

2019-03-09 Thread James Burton via dev-security-policy
Matt's right, you need to discuss this on the CAB Forum.

Burton

On Sat, Mar 9, 2019 at 9:10 AM Matt Palmer via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> On Fri, Mar 08, 2019 at 08:43:49PM -0600, Matthew Hardeman via
> dev-security-policy wrote:
> > I know this isn't the place to bring a BR ballot, but I'm not presently a
> > participant there.
>
> My understanding is that discussing potential BR changes here is actively
> counter-productive, because of intellectual property concerns around the
> BRs.  Basically, if you want to propose a change in the BRs, you really
> *have* to sign up as an interested party and do the IPR agreement dance,
> otherwise... problems.  And no, disclaiming copyrights, etc probably isn't
> enough, because it's far more complicated than that.
>
> - Matt
>


Re: DarkMatter Concerns

2019-03-07 Thread James Burton via dev-security-policy
I'm talking about someone from a restricted country using an undocumented
domain name to obtain a Let's Encrypt certificate, and there is nothing that
can be done about it. We can't predict the future.

Thank you,

Burton

On Thu, Mar 7, 2019 at 5:23 PM Matthew Hardeman  wrote:

>
> On Thu, Mar 7, 2019 at 11:11 AM James Burton  wrote:
>
>> Let's be realistic, anyone can obtain a domain validated certificate from
>> Let's Encrypt and there is nothing really we can do to prevent this from
>> happening. Methods exist.
>>
>
> I am continuing to engage in this tangent only in as far as it illustrates
> the kinds of geopolitical issues that already taint this space and in as
> much as that, I believe has some relevance for the larger conversation.
> Now that I've said that, please, by all means, if I'm wrong about the
> referenced assertion that I've posted, reach out to the usareally.com
> people and help them get a Let's Encrypt certificate.  Good luck with that.
>


Re: DarkMatter Concerns

2019-03-07 Thread James Burton via dev-security-policy
Let's be realistic, anyone can obtain a domain validated certificate from
Let's Encrypt and there is nothing really we can do to prevent this from
happening. Methods exist.

Thank you,

Burton

On Thu, Mar 7, 2019 at 4:59 PM Matthew Hardeman  wrote:

>
> On Thu, Mar 7, 2019 at 10:54 AM James Burton  wrote:
>
>> Let's Encrypt issues domain validation certificates and anyone with a
>> suitable domain name (e.g. .com, .net, .org  ) can get one of these
>> certificates just by proving control over the domain by using the DNS or "
>> /.well-known/pki-validation" directory as stated in the CAB Forum baseline
>> requirements. Country location doesn't matter.
>>
>
> I'm sorry, but that is inaccurate.  There are literally banned
> subscribers.  Let's Encrypt has publicly and officially acknowledged
> this[1].
>
> [1]
> https://community.letsencrypt.org/t/according-to-mcclatchydc-com-lets-encrypt-revoqued-and-banned-usareally-com/81517/10?u=mdhardeman
>


Re: DarkMatter Concerns

2019-03-07 Thread James Burton via dev-security-policy
I mean the country location of the individual doesn't matter. They could,
for example, be using a VPN to connect to a Google Cloud instance and get a
certificate that way.

Thank you,

Burton

On Thu, Mar 7, 2019 at 4:53 PM James Burton  wrote:

> Let's Encrypt issues domain validation certificates and anyone with a
> suitable domain name (e.g. .com, .net, .org  ) can get one of these
> certificates just by proving control over the domain by using the DNS or "
> /.well-known/pki-validation" directory as stated in the CAB Forum baseline
> requirements. Country location doesn't matter.
>
> Thank you
>
> Burton
>
>
> On Thu, Mar 7, 2019 at 4:29 PM Matthew Hardeman 
> wrote:
>
>>
>>
>> On Thu, Mar 7, 2019 at 10:20 AM Matthew Hardeman 
>> wrote:
>>
>>>
>>> Let's Encrypt does not quite provide certificates to everyone around the
>>> world.  They do prevent issuance to and revoke prior certificates for those
>>> on the United States various SDN (specially designated nationals) lists.
>>> For example, units of the Iraqi government or those acting at their behest
>>> may not receive Let's Encrypt certificates.
>>>
>>
>> Whoops!  I meant to say the Iranian government.
>>
>


Re: DarkMatter Concerns

2019-03-07 Thread James Burton via dev-security-policy
Let's Encrypt issues domain validation certificates, and anyone with a
suitable domain name (e.g. .com, .net, .org) can get one of these
certificates just by proving control over the domain using DNS or the
"/.well-known/pki-validation" directory, as stated in the CAB Forum Baseline
Requirements. Country location doesn't matter.
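
As a rough sketch of what the file-based variant of that check looks like
from the CA's side (the file name, token handling and plain-HTTP fetch here
are illustrative assumptions, not any particular CA's implementation; Let's
Encrypt itself uses the ACME http-01 challenge under
/.well-known/acme-challenge/ rather than this exact path):

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class WellKnownCheck {
        // Sketch of an "agreed-upon change to website" check: the CA tells
        // the applicant to publish a random token under
        // /.well-known/pki-validation/ and then fetches it back.
        static boolean controlsDomain(String domain, String expectedToken) throws Exception {
            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("http://" + domain + "/.well-known/pki-validation/token.txt"))
                    .GET()
                    .build();
            HttpResponse<String> response =
                    client.send(request, HttpResponse.BodyHandlers.ofString());
            return response.statusCode() == 200
                    && response.body().trim().equals(expectedToken);
        }

        public static void main(String[] args) throws Exception {
            // Hypothetical usage: the token would have been generated by the
            // CA and communicated to the applicant beforehand.
            System.out.println(controlsDomain("example.com", "random-token-value"));
        }
    }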

Thank you

Burton


On Thu, Mar 7, 2019 at 4:29 PM Matthew Hardeman  wrote:

>
>
> On Thu, Mar 7, 2019 at 10:20 AM Matthew Hardeman 
> wrote:
>
>>
>> Let's Encrypt does not quite provide certificates to everyone around the
>> world.  They do prevent issuance to and revoke prior certificates for those
>> on the United States various SDN (specially designated nationals) lists.
>> For example, units of the Iraqi government or those acting at their behest
>> may not receive Let's Encrypt certificates.
>>
>
> Whoops!  I meant to say the Iranian government.
>


Re: DarkMatter Concerns

2019-03-07 Thread James Burton via dev-security-policy
Benjamin,

There is one theme in all of your responses and it's perfectly clear that
you feel strongly that this discussion as a whole is an attack not only on
DarkMatter's operations but on the United Arab Emirates' sovereign right
to be able to have a root included in the Mozilla root store and to use a
non-constrained intermediate. You're constantly framing your responses to
discredit and attack well-respected, fair and honest individuals by stating
that they are peddling a hidden agenda against DarkMatter and the United
Arab Emirates, which is clearly false. Their motives are to protect Mozilla
users around the world, and to do this they are objectively looking at all
of the reports from multiple news organizations, and at previous and
ongoing discussions here and elsewhere, to determine if DarkMatter's
operations are truly trustworthy to the highest degree. Remember, money
can't buy trustworthiness; it must be earned by clearly showing the true
face of the operations within the organization. Next.

The CAB Forum's current and previous ballots and discussions are public
knowledge, and claiming that DarkMatter couldn't have known about these
discussions or ballots is porkies. What you are really saying to everyone
is that DarkMatter couldn't be bothered to search through the CAB Forum's
previous discussions and ballots, which demonstrates an amateurish operation
at heart. Being a CA is a serious operation, and as such a CA is expected
in everyone's eyes to know every policy, every current and previous ballot,
every RFC standard, etc. which affects the CA operationally.
Next.

There isn't any monopoly that prevents citizens and organizations in the
United Arab Emirates from getting certificates from CAs, and they are not
expensive. Let's Encrypt provides free domain validated certificates to
everyone around the world. Next.

Thank you,

Burton

On Thu, Mar 7, 2019 at 9:54 AM Matt Palmer via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> On Thu, Mar 07, 2019 at 05:17:07AM +, Benjamin Gabriel via
> dev-security-policy wrote:
> > On Wednesday, March 6, 2019 7:51 PM, Ryan Sleevi wrote:>
> > >DarkMatter response to the serial number issue has demonstrated
> > >that DarkMatter did not do the expected due diligence to investigate
> > >and understand the issue.
> >
> > Your statement as Google's representative is quite disingenuous and
> > self-serving.  As a new member of the CABForum, we were not privy to the
> > discussions for Ballot 164, and have interpreted the Baseline
> Requirements
> > as they were written.
>
> I explained[1] how repeatedly asking an RNG for a 64-bit number that meets
> certain criteria is not 64 bits of output from said RNG.  Coming to that
> conclusion doesn't require a history lesson.
>
> Making the mistake isn't the real problem, though.  Mistakes happen.  It is
> how the mistake is responded to which is important.  DarkMatter's
> representative persisted in trying to pretend there wasn't a problem when
> there was.  That does not show the sort of openness to improvement which I,
> at least, would prefer to see in a globally-trusted CA.
>
> > >You have highlighted that you believe such articles are misleading,
> > > but there  are a number of unresponded questions to past replies
> > > that seek to better understand.
> >
> > I am glad that you brought this up directly with me - and in this public
> > discussion.  Ryan, you have been one of the individuals who have been
> > persistent in spreading this false narrative - as far back as February
> > 2018 - during our initial submission to CABForum.  We have duly noted and
> > have been aware of your persistent attempts to interfere with our
> > contractual relations.  Your employer should know that we have had to
> > expend considerable effort to defend against your back-room politicking,
> > and defamatory innuendos, about the nature of our business.
>
> I'm curious how you think that throwing around veiled threats of legal
> action against one of the more widely-respected members of this community
> is
> going to encourage people to trust your organisation *more*.
>
> - Matt
>
> [1]
> https://groups.google.com/d/msg/mozilla.dev.security.policy/nnLVNfqgz7g/c6HoK97RBQAJ
>
> ___
> dev-security-policy mailing list
> dev-security-policy@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-security-policy
>
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: DarkMatter Concerns

2019-03-07 Thread James Burton via dev-security-policy
Benjamin,

There is one theme in all of your responses, and it's perfectly clear that
you feel strongly that this discussion as a whole is an attack not only on
DarkMatter's operations but on the United Arab Emirates' sovereign right
to have a root included in the Mozilla root store and to use a
non-constrained intermediate. You're constantly framing your responses to
discredit and attack well-respected, fair and honest individuals by stating
that they are peddling a hidden agenda against DarkMatter and the United
Arab Emirates, which is clearly false. Their motives are to protect Mozilla
users around the world, and to do this they are objectively looking at all
of the reports from multiple news organizations and at previous and ongoing
discussions here and elsewhere to determine whether DarkMatter's operations
are truly trustworthy to the highest degree. Remember, money can't buy
trustworthiness; it must be earned by clearly showing the true face of the
operations within the organization. Next.

The CAB Forum's current and previous ballots and discussions are public
knowledge, and claiming that DarkMatter couldn't have known about these
discussions or ballots is simply untrue. What you are really saying to
everyone is that DarkMatter couldn't be bothered to search through the CAB
Forum's previous discussions and ballots, which demonstrates an amateurish
operation at heart. Being a CA is a serious undertaking, and as such a CA is
expected, in the eyes of everyone, to know every policy, every current and
previous ballot, every RFC standard, and everything else that affects it
operationally. Next.

There isn't any monopoly preventing citizens and organizations in the
United Arab Emirates from getting certificates from CAs, and those
certificates are not expensive. Let's Encrypt provides free domain-validated
certificates to everyone around the world. Next.

Thank you,

Burton



On Thu, Mar 7, 2019 at 8:09 AM Benjamin Gabriel via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> Dear Ryan,
>
> A fair and transparent public discussion requires full disclosure of each
> participant's motivations and ultimate agenda.  Whether in CABForum, or
> Mozilla-dev-security-policy, I represent the viewpoints of my employer
> DarkMatter and passionately believe in our unflagging efforts to provide
> the citizens, residents and visitors to the United Arab Emirates with the
> same internet security and privacy protections that are taken for granted
> in other parts of the world.
>
> On Wednesday, March 6, 2019 7:51 PM, Ryan Sleevi wrote:
> >  (Writing in a personal capacity)
>
> Until such time as we have been formally advised by your employer
> (Google), that you no longer represent their views in CABForum, or in this
> Mozilla-dev-security-policy forum, we will proceed on the basis that all of
> your statements are the official viewpoint of your employer (Google).
>
> >   I highlight this, because given the inherently global nature of the
> >   Internet,  there is no technical need to work with local CAs, and,
> >   with a well-run root store,  all CAs provide an equivalent level of
> >   protection and security, which rests in the domain authorization
>
> We reject your paternalistic view that there is no technical need for a
> local United Arab Emirates CA.  Our own research has determined that
> approximately 68% of the websites in the United Arab Emirates are not
> adequately protected for HTTPS traffic (double the global average).  If
> those incumbent CA monopolies that you champion were doing such a great job
> globally - why such a stark difference?
>
> We are of the view that CA monopolies are inherently bad for the internet
> in that they unfairly exploit market power. The result is  a fundamental
> right to Internet security and privacy being deliberately priced out of
> reach for a significant population of the world.  We ask you, what can be
> more an anti-competitive monopoly than  a "well run store" (read
> Google/Mozilla) that does not take into consideration that sovereign
> nations have the fundamental right to provide digital services to their own
> citizens, utilizing their own national root, without being held hostage by
> a provider situated in another nation.  You should note that DarkMatter's
> request is also for the inclusion of UAE's national root.
>
> >DarkMatter response to the serial number issue has demonstrated
> >that DarkMatter did not do the expected due diligence to investigate
> >and understand the issue.
>
> Your statement as Google's representative is quite disingenuous and
> self-serving.   As a new member of the CABForum, we were not privy to the
> discussions for Ballot 164, and have interpreted the Baseline Requirements
> as they were written.   We have made the necessary incident report and
> corrections. [1]  We note that your own employer, Google, also discovered
> that it had the same entropy non-compliance with its serial numbers (as a
> result of the DarkMatter discussions 

Re: Underscore characters

2018-12-27 Thread James Burton via dev-security-policy
On Thu, Dec 27, 2018 at 9:00 PM Ryan Sleevi  wrote:

> I'm not really sure I understand this response at all. I'm hoping you can
> clarify.
>
> On Thu, Dec 27, 2018 at 3:45 PM James Burton  wrote:
>
>> For a CA to intentionally state that they are going to violate the BR
>> requirements means that that CA is under immense pressure to comply with
>> demands or face retribution.
>>
>
> I'm not sure I understand how this flows. Comply with whose demands? Face
> retribution from who, and why?
>

The CA must be under immense pressure to comply with demands from certain
customers if it has determined that it doesn't have much of a choice but to
intentionally violate the BR requirements, and by telling the community and
root stores early it is hoping for leniency. The retribution from those
customers could be legal action, which is outside the scope of this forum
but is still relevant to the CA if that is the case.


>
>> The severity inflicted on a CA by intentionally violating the BR
>> requirements can be severe. Rolling a dice of chance. Why take the risk?
>>
>
> I'm not sure I understand the question at the end, and suspect there's a
> point to the question I'm missing.
>

The CA is rolling the dice of chance: it is intentionally risking
everything by violating the BR requirements, and it knows that such action
can lead to sanctions or distrust in the worst case. The question I asked is
why they are taking the risk, which follows from the first statement.


> Presumably, a CA stating they're going to violate the BR requirements,
> knowing the risk to trust that it may pose, would have done everything
> possible to gather every piece of information so that they could assess the
> risk of violation is outweighed by whatever other risks (in this case,
> revocation). If that's the case, is it unreasonable to ask how the CA
> determined that - which is the root cause analysis question? And how to
> mitigate whatever other risk (in this case, revocation) poses going
> forward, so that violating the BRs isn't consistently seen as the "best"
> option?
>
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: Underscore characters

2018-12-27 Thread James Burton via dev-security-policy
For a CA to intentionally state that it is going to violate the BR
requirements means that that CA is under immense pressure to comply with
demands or face retribution. The consequences inflicted on a CA for
intentionally violating the BR requirements can be severe. It is rolling a
dice of chance. Why take the risk?



On Thu, Dec 27, 2018 at 8:21 PM Ryan Sleevi via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> I'm not trying to throw you under the bus here, but I think it's helpful if
> you could highlight what new information you see being required, versus
> that which is already required.
>
> I think, yes, you're right that it's not well received if you go violate
> the BRs and then, after the fact, say "Hey, yeah, we violated, but here's
> why", and finding out that the reasons are met with a lot of skepticism and
> the math being shaky, and you can see that from past incident reports it
> doesn't go over well.
>
> But it's also not well received if it's before, and the statement is "Our
> customer thinks we should violate the BRs. What would happen if we did, and
> what information do you need from us?". That gets into the moral hazard
> that Matt spoke to, and is a huge burden on the community where the
> expectation is that the CA says "Sorry, we can't do that".
>
> So the assumption here is that, in all of this discussion, DigiCert's done
> everything it can to understand the issue, the timelines, remediation, etc,
> and has plans to address both each and every customer and the systemic
> issues that have emerged. If that's not the case, then how are we not in
> one of those two scenarios above? And if it is the case, isn't that
> information readily available by now?
>
> From the discussions on the incident reports, I feel like that's been the
> heart of the questions; which is trying to understand what the root cause
> is and what the remediation plan is. The statement "We'll miss the first
> deadline, but we'll hit the second", but without any details about how or
> why, or the steps being taken to ensure no deadlines are missed in the
> future, doesn't really inspire confidence, and is exactly the same kind of
> feedback that would be given post-incident.
>
> On Thu, Dec 27, 2018 at 1:50 PM Jeremy Rowley via dev-security-policy <
> dev-security-policy@lists.mozilla.org> wrote:
>
> > There's a little bit of a "damned if you do, damned if you don't problem
> > here". Wait until you have all the information? That's a paddlin'.  File
> > before you have enough information? That's a paddlin'. I'd appreciate
> > better guidance on what Mozilla expects from these incident reports
> > timing-wise.
> >
> > -Original Message-
> > From: dev-security-policy  >
> > On Behalf Of Jeremy Rowley via dev-security-policy
> > Sent: Thursday, December 27, 2018 11:47 AM
> > To: r...@sleevi.com
> > Cc: dev-security-policy@lists.mozilla.org
> > Subject: RE: Underscore characters
> >
> > The original incident report contained all of the details of the initial
> > filing.  The additional, separated reports are trickling in as I get
> enough
> > info to post something in reply to the updated questions. As the
> questions
> > asked have changed from the original 7 in the Mozilla incident report,
> > getting the info back takes time. Especially during the holiday season.
> > We’re also working to close out as many without an exception as possible.
> > Note that the deadline has not passed yet so all of these incident
> reports
> > are theoretical (and not actually incidents) until Jan 15th. I gave the
> > community the total potential number of certificates impacted and the
> total
> > number of customers so we can have a community discussion on the overall
> > risk and get public comments into the process before the deadline passes.
> > I’m unaware of any policy at Mozilla or Google that provides guidance on
> > how to file expected issues before they happen. If there is, I’d gladly
> > follow that.
> >
> ___
> dev-security-policy mailing list
> dev-security-policy@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-security-policy
>
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: Use cases of publicly-trusted certificates

2018-12-27 Thread James Burton via dev-security-policy
The main reason that publicly trusted certificates are used by
organizations for all infrastructure (internal and external) is that it's
far cheaper than building and maintaining an internal PKI.

On Thu, Dec 27, 2018 at 4:14 PM Jakob Bohm via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> On 27/12/2018 17:02, Rob Stradling wrote:
> > On 27/12/2018 15:38, Jakob Bohm via dev-security-policy wrote:
> > 
> >> For example, the relevant EKU is named "id-kp-serverAuth" not "id-kp-
> >> browserWwwServerAuth" .  WWW is mentioned only in a comment under the
> >> OID definition.
> >
> > Hi Jakob.
> >
> > Are you suggesting that comments in ASN.1 specifications are meaningless
> > or that they do not convey intent?
> >
> > Also, are you suggesting that a canonical OID name must clearly convey
> > the full and precise intent of the purpose(s) for which the OID should
> > be used?
> >
>
> In general no.  However in this special case, the comment is
> inconsistent with everything else.
>
> Enjoy
>
> Jakob
> --
> Jakob Bohm, CIO, Partner, WiseMo A/S.  https://www.wisemo.com
> Transformervej 29, 2860 Søborg, Denmark.  Direct +45 31 13 16 10
> This public discussion message is non-binding and may contain errors.
> WiseMo - Remote Service Management for PCs, Phones and Embedded
> ___
> dev-security-policy mailing list
> dev-security-policy@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-security-policy
>
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: Google Trust Services Root Inclusion Request

2018-09-27 Thread James Burton via dev-security-policy
Richard,

Your conduct is totally unacceptable and won’t be tolerated. You must read
the forum rules regarding etiquette.

Also, I suggest you apologise to Ryan.

James



On Thu, 27 Sep 2018 at 10:33, Rob Stradling via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> Richard,
>
> You might like to familiarize yourself with the Mozilla Forum Etiquette
> Ground Rules:
> https://www.mozilla.org/en-US/about/forums/etiquette/
>
> Note this in particular:
> "Be civil.
> No personal attacks. Do not feel compelled to defend your honor in
> public. Posts containing personal attacks may be removed from the news
> server."
>
> On 27/09/2018 07:59, Richard Wang via dev-security-policy wrote:
> > Sorry, I don't agree with this point. Ryan Sleevi is the Mozilla Module
> Peer that gave too many pressures to the M.D.S.P community to misleading
> the Community and to let Mozilla make the decision that Google want.
> >
> > There are two facts to support my opinion:
> >
> > (1) For StartCom sanction, Mozilla agreed in Oct 2nd 2016 London meeting
> that if we separate StartCom completely from WoSign, then Mozilla don't
> sanction StartCom that still trust StartCom root. But Google as peer of
> Mozilla Module don't agree this, and Ryan even found many very very old
> problems of StartCom to be a "fact" that must be distrusted. Google changed
> the Mozilla decision!
> >
> > (2) For Symantec sanction, everyone can see the argues in M.D.S.P
> discussion from Ryan Sleevi that Google changed the Mozilla initial
> decision, this also is the fact.
> >
> > So, we can see Ryan not just a Mozilla Module Peer, he represents Google
> browser that affect Mozilla to make the right decision.
> >
> > Ryan, don't feel too good about yourself. Peoples patiently look at your
> long emails at M.D.S.P and listen to your bala bala speaking at the CABF
> meeting, this is because you represent Google Chrome, and Google Chrome
> seriously affects Mozilla that have the power to kill any CAs. If you leave
> Google, you will be nothing, no one will care about your existence, and no
> one will care what you say. So, please don't declare that you don't
> represent Google before you speak next time, nonsense!
> >
> > Your myopic has brought global Internet security to the ditch. Chrome
> display "Secure" for a website just it has SSL(https). Many fake banking
> websites and fake PayPal websites have Lets Encrypt certificates, and
> Google Chrome say it is "Secure", this completely misleads global Internet
> users, resulting in many users are deceived and lost property. Encryption
> is not equal to secure. Secure means not only encryption, but also need to
> tell user the website's true identity. Does a fake bank website encryption
> mean anything? nothing and more worse.
> >
> > Ryan, 别自我感觉太好,别人耐心看你在M.D.S.P的长篇大论和听你在CABF meeting上说过没完
> ,是因为你代表谷歌浏览器,而谷歌浏览器严重影响Mozilla对所有CA有生杀大权。如果你离开谷歌,你将什么也不是,没有人会理会你的存在,也没有人会在意你说的话。所以下次不要在发言之前就声明不代表谷歌,废话哦!
> >
> > 你的短视把全球互联网安全带到了沟里,认为有SSL证书(https)就安全,许多假冒银行网站、假冒PayPal 网站都有Lets
> Encrypt证书,谷歌浏览器显示为安全,完全误导了全球互联网用户,导致许多用户上当受骗和财产损失。已加密并不等于安全,安全不仅意味着需要加密,而且还需要告知用户此网站的真实身份,一个假冒银行网站加密有任何意义吗?没有并且更糟糕。
> >
> >
> > Best Regards,
> >
> > Richard Wang
> >
> >  Original Message 
> > From: Ryan Sleevi via dev-security-policy
> > Received: Thursday, 27 September 2018 00:53
> > To: Jeremy Rowley
> > Cc: Ryan Sleevi ; mozilla-dev-security-policy
> > Subject: Re: Google Trust Services Root Inclusion Request
> >
> >
> > On Wed, Sep 26, 2018 at 12:04 PM Jeremy Rowley <
> jeremy.row...@digicert.com>
> > wrote:
> >
> >> I also should also emphasize that I’m speaking as Jeremy Rowley, not as
> >> DigiCert.
> >>
> >>
> >>
> >> Note that I didn’t say Google controlled the policy. However, as a
> module
> >> peer, Google does have significant influence over the policy and what
> CAs
> >> are trusted by Mozilla. Although everyone can participate in Mozilla
> >> discussions publicly, it’s a fallacy to state that a general participant
> >> has similar sway or authority to a module peer. That’s the whole point
> of
> >> having a separate class for peers compared to us general public.  With
> >> Google acting as a CA and module peer, you now have one CA heavily
> >> influencing who its competitors are, how its competitors operate, and
> what
> >> its competitors can do.  Although I personally find that you never
> misuse
> >> your power as a module peer, I can see how Jake has concerns that Google
> >> (as a CA) has very heavy influence over the platform that has
> historically
> >> been the CA watchdog (Mozilla).
> >>
> >
> > Jeremy, I think this again deserves calling out, because this is
> > misrepresenting what module peership does, as well as the CA
> relationship.
> >
> > I linked you to the definition of Module Ownership, which highlights and
> > emphasizes that the module peer is simply a recognized helper. To the
> > extent there is any influence, it is through the public discussions here.
> > If your concern 

Re: Disallowed company name

2018-06-04 Thread James Burton via dev-security-policy
This company only cost £10: £6 for the incorporation and £4 for sending the
NE01 form to Companies House.
On Mon, 4 Jun 2018 at 08:58, Jeremy Rowley via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> Punctuation differences are not enough to register a name in the us, or at
> least in the jurisdictions here I’m aware of.
>
> > On Jun 4, 2018, at 1:04 AM, Ryan Hurst via dev-security-policy <
> dev-security-policy@lists.mozilla.org> wrote:
> >
> > I apologize, I originally wrote in haste and did not clearly state what I
> > was suggesting.
> >
> > Specifically, while it is typical for a given jurisdiction (state, etc)
> to
> > require a name to be unique, it is typically not a requirement for it to
> > not be so unique that it can not be confused for another name. For
> example,
> > I have seen businesses registered with punctuation and without; I have
> also
> > seen non-latin characters in use in business names this clearly has the
> > potential to introduce name confusion.
> >
> > Ryan
> >
> > On Fri, Jun 1, 2018 at 11:55 PM, Matthew Hardeman 
> > wrote:
> >
> >>
> >>
> >> On Fri, Jun 1, 2018 at 10:28 AM, Ryan Hurst via dev-security-policy <
> >> dev-security-policy@lists.mozilla.org> wrote:
> >>
> >>>
> >>> re: Most of the government offices responsible for approving entity
> >>> creation are concerned first and foremost with ensuring that a unique
> name
> >>> within their jurisdiction is chosen
> >>>
> >>> What makes you say that, most jurisdictions have no such requirement.
> >>>
> >>>
> >> This was anecdotal, based on my own experience with formation of various
> >> limited liability entities in several US states.
> >>
> >> Even my own state of Alabama, for example, (typically regarded as pretty
> >> backwards) has strong policies and procedures in place for this.
> >>
> >> In Alabama, formation of a limited liability entity whether a
> Corporation
> >> or LLC, etc, begins with a filing in the relevant county probate court
> of
> >> an Articles of Incorporation, Articles or Organization, trust formation
> >> documents, or similar.  As part of the mandatory filing package for
> those
> >> document types, a name reservation certificate (which will be validated
> by
> >> the probate court) from the Alabama Secretary of State will be required.
> >> The filer must obtain those directly from the appropriate office of the
> >> Alabama Secretary of State.  (It can be done online, with a credit card.
> >> The system enforces entity name uniqueness.)
> >>
> > ___
> > dev-security-policy mailing list
> > dev-security-policy@lists.mozilla.org
> > https://lists.mozilla.org/listinfo/dev-security-policy
> ___
> dev-security-policy mailing list
> dev-security-policy@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-security-policy
>
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: Disallowed company name

2018-06-02 Thread James Burton via dev-security-policy
I've spoken with a few UK banks about opening a bank account for ";" and
they are happy to proceed.

James Burton

On Fri, Jun 1, 2018 at 11:58 PM Matthew Hardeman 
wrote:

>
>
> On Thu, May 31, 2018 at 8:38 PM, Peter Gutmann 
> wrote:
>
>>
>> >Banks, trade vendors, etc, tend to reject accounts with names like this.
>>
>> Do they?
>>
>> https://www.flickr.com/photos/nzphoto/6038112443/
>
>
> I would hope that we could agree that there is generally a different risk
> management burden in getting a store loyalty tracking card versus getting a
> loan or even opening a business demand deposit account.
>
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: Disallowed company name

2018-06-01 Thread James Burton via dev-security-policy
Hi Jeremy,

In the UK it would be classed as “same as” and therefore wouldn’t be allowed
to be incorporated. You can see this in the links:

Companies Act 2006:
https://www.legislation.gov.uk/ukpga/2006/46/part/5/chapter/3

The Company, Limited Liability Partnership and Business (Names and Trading
Disclosures) Regulations 2015:
http://www.legislation.gov.uk/uksi/2015/17/regulation/7/made


James Burton

On Fri, 1 Jun 2018 at 20:32, Jeremy Rowley via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> Can you point to a jurisdiction that allows you to register the same name?
> I've never seen an example where it's permitted. Maybe the UK?
>
> -Original Message-
> From: dev-security-policy  digicert@lists.mozilla.org> On Behalf Of Ryan Hurst via
> dev-security-policy
> Sent: Friday, June 1, 2018 9:28 AM
> To: mozilla-dev-security-pol...@lists.mozilla.org
> Subject: Re: Disallowed company name
>
> On Thursday, May 31, 2018 at 3:07:36 PM UTC-7, Matthew Hardeman wrote:
> > On Thu, May 31, 2018 at 4:18 PM, Peter Saint-Andre via
> > dev-security-policy < dev-security-policy@lists.mozilla.org> wrote:
> > >
> > >
> > > We can also think of many business types (e.g., scammers) that would
> > > love to have names like ⒶⓅⓅⓁⒺ but that doesn't mean it's smart to
> > > issue certificates with such names. The authorities who approve of
> > > company names don't necessarily have certificate handling in mind...
> > >
> >
> > Indeed.  Most of the government offices responsible for approving
> > entity creation are concerned first and foremost with ensuring that a
> > unique name within their jurisdiction is chosen and that a public
> > record of the entity creation exists.  They are not concerned with
> > risk management or legitimacy, broadly speaking.
> >
> > Anyone at any level of risk management in the rest of the ecosystem
> > around a business will be concerned with such matters.  Banks, trade
> > vendors, etc, tend to reject accounts with names like this.  Perhaps
> > CAs should look upon this similarly.
>
> re: Most of the government offices responsible for approving entity
> creation are concerned first and foremost with ensuring that a unique name
> within their jurisdiction is chosen
>
> What makes you say that, most jurisdictions have no such requirement.
> ___
> dev-security-policy mailing list
> dev-security-policy@lists.mozilla.org
>
> https://clicktime.symantec.com/a/1/H8qZVRE5_iLNO8giNWdHECRPUnhWmem4t7fNC9FYfaI=?d=3yPd_yGx1m4dQz3H1uWi0wkNACGDGvIL4Z--LNoP9eDPIWeD0dhf9Ol_tFkJGBJFFgtnLt2HO_UCbFnaqQu3zUWQTHxGduRJO0a_H4yYE3qhYRX3wzvleMJ_cCcflYSP6doSbnmNReFJlR_Gjut8oNV6EnnecC1kzxXkdJG19OPUi3qjxKSp_r4Tlk3ExNNIwR3DF26nn1z6wKDyzP1siUdOGQT4oa70wTAPNZrK417n5z35ynmL65-hmQXBJkPLvbJL_UkzAgimEa4Sjh8YgHtKR2tCSas65vpsh0YyIXTny7Puzb8Hvs9uNxGPMfSyStkq2pMn3jZpzjfKsgYMMKDzdouOUktqhPACnhr6Qsx3ZdCTubWI8EkLpQsj4nYxjihAKD9mM-9LyUQGRh4mQOOQ0U4zY3qAE6fPOz-Upa5efnAlQhO0GtTkcOHiosY%3D=https%3A%2F%2Flists.mozilla.org%2Flistinfo%2Fdev-security-policy
> ___
> dev-security-policy mailing list
> dev-security-policy@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-security-policy
>
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Disallowed company name

2018-05-31 Thread James Burton via dev-security-policy
I posted this also on the CAB Forum validation mailing list but I think
it's worthy of discussion on both lists.


I recently incorporated the company named ";", see:
https://beta.companieshouse.gov.uk/company/11363219. This company complies
with both the "Companies Act 2006" and "The Company, Limited Liability
Partnership and Business (Names and Trading Disclosures) Regulations 2015".


Now the current guidelines state that this type of name is not allowed due
to section 7.1.4.2.2 (j). I misinterpreted this requirement and thought
that, because this name is legitimate and incorporated, it should be allowed
in the O field. I was corrected by a forum member.


This is wrong and should be changed to allow all types of legally
incorporated company names to get certificates. I understand this
doesn't fit any of the standard company name profiles you've seen, but this
company name can be used in practice and I can think of many business
types that would love this type of name.


Thank you


James Burton
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: Sigh. stripe.ian.sh back with EV certificate for Stripe, Inc of Kentucky....

2018-04-13 Thread James Burton via dev-security-policy
Judges must follow the law to the letter and must not let personal feelings
influence their decisions. The same rules apply to CAs. Every company that
passes the EV guidelines has the right to an EV cert, and CAs must be
impartial even if that cert might cause harm. If the CA doesn't like it,
it should file a ballot in the CAB Forum.

James Burton

On Fri, Apr 13, 2018 at 10:23 PM, Ryan Sleevi via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> On Fri, Apr 13, 2018 at 5:15 PM, Matthew Hardeman via dev-security-policy <
> dev-security-policy@lists.mozilla.org> wrote:
> >
> > I only named Let's Encrypt as an example of a CA that maintains a
> scrubbing
> > "blacklist".  In their case, it appears to require exact match to a label
> > including TLD and TLD+1.  I was kind of surprised that they didn't just
> > take all the high value domain names as to the TLD+1 field and decline
> all
> > combinations of (0...n_labels.)HIGH_VALUE_TLD+1.ANY_TLD_HERE, but I'm
> sure
> > there's a reasonable case either way.
> >
>
> Reading the DNS policy discussions (over the past two decades) provides an
> adequately ample understanding of the problems with, and complexities of,
> such a naieve policy. The discussion around 'sunrise' and 'early
> registration' periods for TLDs, or the UDRP, should be mandatory
> comprehension for anyone arguing in favor of "popularity contests" or "big
> domain holders > small domain holders" or "trademark holders > free speech"
> or... well, the list goes on with the bad ideas proposed here that have
> been roundly rejected by civil society and technologists regarding the
> administration of the DNS :)
> ___
> dev-security-policy mailing list
> dev-security-policy@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-security-policy
>
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: Sigh. stripe.ian.sh back with EV certificate for Stripe, Inc of Kentucky....

2018-04-12 Thread James Burton via dev-security-policy
We both work in the security space and yes, usually blocking a proof of
concept is good practice, but in this situation I feel that revoking this
cert was heavy-handed and unnecessary. The probability of Ian using the EV
cert for deceptive purposes was extremely low.

There are tons more ways of using EV certs for bad purposes.

James





On Thu, 12 Apr 2018 at 23:35, Jakob Bohm via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> On 12/04/2018 21:20, James Burton wrote:
> >   Both mine and Ian's demonstrations never harmed or deceived anyone as
> they
> > were proof of concept. The EV certs were properly validated to the
> > EV guidelines. Both companies are legitimate. So what's the issue? None.
> >
> >
>
> In the security space, blocking a proof of concept exploit is usually
> considered the right thing to do.  But doing so in a way that is
> entirely limited to the concrete example rather than the underlying
> problem is considered cheating.
>
> Consider, as an analogy, a hypothetical freedom of speech law whose only
> exception was that you must not shout "fire" in a packed theater.  Then
> an actor standing on stage making speech about the silliness of that law
> and then shouting "fire", with full warning of the audience to avoid
> panic, should not be surprised to get charged with the specific offense,
> as it was a deliberate test of the law.  Of cause, such an actor might
> deserve some leniency in the punishment, such as a $1 fine, but he
> should not be surprised the law is formally upheld.
>
>
>
> Enjoy
>
> Jakob
> --
> Jakob Bohm, CIO, Partner, WiseMo A/S.  https://www.wisemo.com
> Transformervej 29, 2860 Søborg, Denmark.  Direct +45 31 13 16 10
> This public discussion message is non-binding and may contain errors.
> WiseMo - Remote Service Management for PCs, Phones and Embedded
> ___
> dev-security-policy mailing list
> dev-security-policy@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-security-policy
>
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: Sigh. stripe.ian.sh back with EV certificate for Stripe, Inc of Kentucky....

2018-04-12 Thread James Burton via dev-security-policy
Here is another example of cross-country company name collision. Recently,
I incorporated a company named "X Corporation" in the United Kingdom.
If someone incorporated exactly the same name in the US, the only
difference between my company and the other person's company in the EV
indicator would be the two-letter country code (in certain browsers). iOS
and OS X don't even display the country code in the EV indicator.

On Thu, Apr 12, 2018 at 8:35 PM, Matthew Hardeman 
wrote:

>
>
> On Thu, Apr 12, 2018 at 2:28 PM, Alex Gaynor  wrote:
>
>> All that proves is the entire EV model cannot possibly accomplish what
>> CAs claims (with respect to phishing and other similar concerns). To whit:
>>
>> - Two companies can validly possess trademarks for the same name in the
>> United States (and I assume other jurisdictions)
>> - A CA, or anyone else's ability to tell if the identity collision is
>> being used maliciously to deceive is totally based on seeing what content
>> is being served under that name; the reality of trademark law means that
>> two organizations with the same name is not inherently deceptive
>> - An actually malicious entity will not broadcast their name collision!
>> Instead they'd probably have a benign website that normal users see, and at
>> particular URLs sent to their victims, they'd serve the misleading content.
>>
>> In conclusion, revoking stripe.ian.sh while ignoring the broader issues
>> WRT the limitations of EV's binding of real world corporate identity to
>> domain control is security theater at its worst.
>>
>> Alex
>>
>>
> I do believe that the EV guidelines and program as it exists today need to
> change.  Clearly, the direction I would change it in is ideologically at
> odds with a majority of active participants who've weighed in to this point.
>
> Perhaps EV changes to require a seasoned history?
> Perhaps EV requires advance publication for scrutiny by the public and
> current holders?
> Perhaps EV requires active monitoring of the sites of the active corpus of
> certs by the issuing CAs?
>
> I'd rather see an optional enhanced trust indicator with reasonable
> guidelines and enforcement than have numerous charlatans manage to get one
> or more garbage ones incorporated into some moronic regulatory scheme.
>
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: Sigh. stripe.ian.sh back with EV certificate for Stripe, Inc of Kentucky....

2018-04-12 Thread James Burton via dev-security-policy
Both my demonstration and Ian's never harmed or deceived anyone, as they
were proofs of concept. The EV certs were properly validated to the
EV guidelines. Both companies are legitimate. So what's the issue? None.



On Thu, Apr 12, 2018 at 8:05 PM, Eric Mill via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> On Thu, Apr 12, 2018 at 2:57 PM, Eric Mill  wrote:
> >
> >
> > Of course, that would break his proof-of-concept exploit.  Which is the
> >> right outcome.  It demonstrates that an EV certificate used in a manner
> >> which might cause confusion will be revoked.  They're not stopping him
> from
> >> publishing.  He can still do that, without the benefit of an EV
> certificate.
> >>
> >
> > The stripe.ian.sh site itself is not likely to cause confusion, and was
> > not an exploit. Here's what stripe.ian.sh looks like right now:
> >
>
> (Inline images don't appear to play too well with m.d.s.p, so I've attached
> the image to this email.)
>
> --
> konklone.com | @konklone 
> ___
> dev-security-policy mailing list
> dev-security-policy@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-security-policy
>
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: Following up on Trustico: reseller practices and accountability

2018-03-05 Thread James Burton via dev-security-policy
It wouldn't stop someone from offering such a service, and it also wouldn't
prevent users from using that CSR, as it is their choice in the end. This
was just an idea.
CAs shouldn't be policing users. CAs should instead be enforcing best
practices on resellers, as practices like this shake user confidence to a
degree which affects everyone.




On Mon, Mar 5, 2018 at 2:15 PM, Ryan Sleevi <r...@sleevi.com> wrote:

> Considering that the Baseline Requirements explicitly acknowledge that the
> Applicant may delegate the obtaining of their certificates to a third-party
> (Applicant Representative), how would you propose that the applicant's
> agents (which, in a legal sense, is the name for their employees - that is,
> those with legal authority to represent the company) and resellers?
>
> What would stop  someone from offering a "CSR-as-a-service" in which they
> generate CSRs for users, and then users take that generated CSR to the CA?
> What role are you suggesting that the CA has to play in policing 'how' the
> CSR was generated - since a CSR is-a CSR is-a CSR?
>
> On Mon, Mar 5, 2018 at 8:26 AM, James Burton via dev-security-policy <
> dev-security-policy@lists.mozilla.org> wrote:
>
>> Currently, resellers are allowed to submit CSRs on behalf of their
>> customers and as we've seen this is open to abuse. Maybe it's time to stop
>> this practice and restrict submission of CSRs to CA portals only.
>>
>> On Mon, Mar 5, 2018 at 12:51 PM, okaphone.elektronika--- via
>> dev-security-policy <dev-security-policy@lists.mozilla.org> wrote:
>>
>> > On Sunday, 4 March 2018 22:44:26 UTC+1, Paul Kehrer  wrote:
>> > > On March 4, 2018 at 5:06:41 PM, Eric Mill via dev-security-policy (
>> > > dev-security-policy@lists.mozilla.org) wrote:
>> > >
>> > > 
>> > >
>> > > It's also clear from the experience that rules of the road for
>> resellers
>> > > are unclear, and that accountability is limited. It seems possible, or
>> > > likely, that other resellers may also be mishandling customer keys
>> > >
>> > > So, what would useful next steps be to improve security and
>> > accountability
>> > > for resellers?
>> > >
>> > >
>> > > As you already suggested an official communication requesting
>> information
>> > > from the CAs about the way their reseller networks manage subscriber
>> key
>> > > material would be a good start. Eventually I think it's likely that
>> > > resellers need to be subject to some limited form of audit (maybe as
>> > > simplistic as a PCI self-assessment questionnaire?). While that
>> doesn't
>> > > prevent bad behavior it would generate an evidence trail for
>> termination
>> > of
>> > > relationships with incompetent/malicious actors.
>> >
>> > I'm not sure that that would be reasonable. After all resellers can have
>> > resellers, and so on so where would that end? With the end user having
>> to
>> > accept a "general license agreement"? And distrusting a reseller could
>> also
>> > be difficult.
>> >
>> > I think it will have to be/remain the responsibility of the CA to choose
>> > their reselllers in such a way that "not too many questions are being
>> > asked" about them.
>> >
>> >
>> > > Of course, CAs are likely to be reluctant to share a complete list of
>> > their
>> > > resellers since they probably consider that competitive information.
>> So,
>> > it
>> > > would be nice if we could just make it part of the CA's audits...
>> > >
>> > > One way to do that would be that the baseline requirements could be
>> > updated
>> > > to create a section defining requirements placed upon resellers
>> > (especially
>> > > around subscriber key management). This way CAs would be incentivized
>> to
>> > > manage their business relationships more carefully since their
>> business
>> > > partners could cause them audit issues. This has some precedent since
>> in
>> > > the past some resellers acted as RAs and had their own subordinates
>> -- a
>> > > practice that was terminated as they came under scrutiny and demands
>> for
>> > > audits.
>> > >
>> > > Mozilla, of course, cannot amend the BRs itself. However, past
>> evidence
>> > > suggests that if the Mozilla program introduces their own requirements
>> > > around reseller management and disclosure then the probability of a
>> CABF
>> > > ballot with similar restrictions passing is relatively high (thus
>> getting
>> > > it into the audit regime).
>> > >
>> > > -Paul
>> >
>> > ___
>> > dev-security-policy mailing list
>> > dev-security-policy@lists.mozilla.org
>> > https://lists.mozilla.org/listinfo/dev-security-policy
>> >
>> ___
>> dev-security-policy mailing list
>> dev-security-policy@lists.mozilla.org
>> https://lists.mozilla.org/listinfo/dev-security-policy
>>
>
>
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: Following up on Trustico: reseller practices and accountability

2018-03-05 Thread James Burton via dev-security-policy
Currently, resellers are allowed to submit CSRs on behalf of their
customers and, as we've seen, this is open to abuse. Maybe it's time to stop
this practice and restrict submission of CSRs to CA portals only.

On Mon, Mar 5, 2018 at 12:51 PM, okaphone.elektronika--- via
dev-security-policy  wrote:

> On Sunday, 4 March 2018 22:44:26 UTC+1, Paul Kehrer  wrote:
> > On March 4, 2018 at 5:06:41 PM, Eric Mill via dev-security-policy (
> > dev-security-policy@lists.mozilla.org) wrote:
> >
> > 
> >
> > It's also clear from the experience that rules of the road for resellers
> > are unclear, and that accountability is limited. It seems possible, or
> > likely, that other resellers may also be mishandling customer keys
> >
> > So, what would useful next steps be to improve security and
> accountability
> > for resellers?
> >
> >
> > As you already suggested an official communication requesting information
> > from the CAs about the way their reseller networks manage subscriber key
> > material would be a good start. Eventually I think it's likely that
> > resellers need to be subject to some limited form of audit (maybe as
> > simplistic as a PCI self-assessment questionnaire?). While that doesn't
> > prevent bad behavior it would generate an evidence trail for termination
> of
> > relationships with incompetent/malicious actors.
>
> I'm not sure that that would be reasonable. After all resellers can have
> resellers, and so on so where would that end? With the end user having to
> accept a "general license agreement"? And distrusting a reseller could also
> be difficult.
>
> I think it will have to be/remain the responsibility of the CA to choose
> their reselllers in such a way that "not too many questions are being
> asked" about them.
>
>
> > Of course, CAs are likely to be reluctant to share a complete list of
> their
> > resellers since they probably consider that competitive information. So,
> it
> > would be nice if we could just make it part of the CA's audits...
> >
> > One way to do that would be that the baseline requirements could be
> updated
> > to create a section defining requirements placed upon resellers
> (especially
> > around subscriber key management). This way CAs would be incentivized to
> > manage their business relationships more carefully since their business
> > partners could cause them audit issues. This has some precedent since in
> > the past some resellers acted as RAs and had their own subordinates -- a
> > practice that was terminated as they came under scrutiny and demands for
> > audits.
> >
> > Mozilla, of course, cannot amend the BRs itself. However, past evidence
> > suggests that if the Mozilla program introduces their own requirements
> > around reseller management and disclosure then the probability of a CABF
> > ballot with similar restrictions passing is relatively high (thus getting
> > it into the audit regime).
> >
> > -Paul
>
> ___
> dev-security-policy mailing list
> dev-security-policy@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-security-policy
>
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: CA Program for security researchers

2018-02-22 Thread James Burton via dev-security-policy
It doesn't take that long for a CA to do vetting checks for OV and EV
certificates when everything is handed to it on a plate. Breaking CAs'
vetting procedures is not too hard.

The key here is that security research shouldn't cost the
researcher thousands to prove a valid point. They should be entitled to
some type of compensation from the CA.
It would be great if CAs ran a program that allowed security researchers to
be compensated after the research instead of having to pay up front.

James

On Thu, Feb 22, 2018 at 10:10 PM, Jakob Bohm via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> On 22/02/2018 22:17, James Burton wrote:
>
>> There needs to be a program that helps security researchers like myself
>> get
>> free or low cost certificates for research purposes. That EV research I
>> did
>> a while ago nearly set me back personally $4,297.
>>
>> James
>>
>>
> I think there are three main cases and an additional concern:
>
> 1. Getting real certificates from a real CA referring to real domains.
>   Only secure option is to get the research sponsored by that CA,
>   perhaps in exchange for giving them a longer than standard heads up of
>   any results regarding their security.
>
> 2. Getting real certificates for a test/dummy domain.
>   Perhaps a weakening rule can be introduced in the BRs (subject o a lot
>   of discussions as this will be very controversial and potentially
>   dangerous), that certificates for the .invalid TLD can be issued under
>   special research terms.  However I doubt the current BR maintainers or
>   the leaders of this Mozilla group will agree to that.
>
> 3. Getting invalid/test certificates for a real domain to test
>   procedures.
>Perhaps some CAs can be talked into setting up a special "test only,
>   DO NOT TRUST" root CA running in parallel to their real trusted roots,
>   allowing cheap issuance for tests and experiments.  Such a test root
>   would not be in the CCADB or any root program, nor be cross-signed by
>   any real roots.
>Such a test hierarchy would also be useful for organizations setting
>   up and testing automated certificate management systems prior to using
>   those systems with real certificates.
>
> Additionally, for the manual step verified EV and OV certificates,
> issuance involves real man-hours at the CA organization.  So for such
> higher grade certificates, getting them for free or on a 30 days-return
> policy would not be a good thing to allow.  Even for testing.
> Especially since such research certificates are probably going to
> trigger additional manual revocation procedures (= more man-hours to be
> paid).
>
>
>
>
> Enjoy
>
> Jakob
> --
> Jakob Bohm, CIO, Partner, WiseMo A/S.  https://www.wisemo.com
> Transformervej 29, 2860 Søborg, Denmark.  Direct +45 31 13 16 10
> This public discussion message is non-binding and may contain errors.
> WiseMo - Remote Service Management for PCs, Phones and Embedded
> ___
> dev-security-policy mailing list
> dev-security-policy@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-security-policy
>
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: CA Program for security researchers

2018-02-22 Thread James Burton via dev-security-policy
I didn't put this in the article because it's not relevant, as an attacker
wouldn't care anyway.

James

On Thu, Feb 22, 2018 at 9:29 PM, James Burton  wrote:

> They tried to charge the amount to the card the day after the certificate
> was issued, but the bank's fraud department called me about the transaction
> and I refused it because it was invalid: it was within the trial period,
> and it was clearly stipulated that I would only be charged after the 30-day
> trial period was up. In the end, I managed to sort it out with them and
> didn't have to pay anything, and I had evidence to support myself in case I
> had to fight it in court.
>
> James
>
> On Thu, Feb 22, 2018 at 9:17 PM, James Burton  wrote:
>
>> There needs to be a program that helps security researchers like myself
>> get free or low cost certificates for research purposes. That EV research I
>> did a while ago nearly set me back personally $4,297.
>>
>> James
>>
>>
>>
>
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: CA Program for security researchers

2018-02-22 Thread James Burton via dev-security-policy
They tried to charge the amount to the card the day after the certificate
was issued, but the bank's fraud department called me about the transaction
and I refused it because it was invalid: it was within the trial period,
and it was clearly stipulated that I would only be charged after the 30-day
trial period was up. In the end, I managed to sort it out with them and
didn't have to pay anything, and I had evidence to support myself in case I
had to fight it in court.

James

On Thu, Feb 22, 2018 at 9:17 PM, James Burton  wrote:

> There needs to be a program that helps security researchers like myself
> get free or low cost certificates for research purposes. That EV research I
> did a while ago nearly set me back personally $4,297.
>
> James
>
>
>
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


CA Program for security researchers

2018-02-22 Thread James Burton via dev-security-policy
There needs to be a program that helps security researchers like myself get
free or low cost certificates for research purposes. That EV research I did
a while ago nearly set me back personally $4,297.

James
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


RE: Misissuance/non-compliance remediation timelines

2018-02-07 Thread James Burton via dev-security-policy
The idea of a grading system being used to judge CAs' compliance would be a
total disaster. We should instead be focusing our efforts on more transparency.

James


-Original Message-
From: dev-security-policy 
[mailto:dev-security-policy-bounces+jb=0.me...@lists.mozilla.org] On Behalf Of 
Tim Hollebeek via dev-security-policy
Sent: 07 February 2018 16:11
To: Alex Gaynor 
Cc: mozilla-dev-security-pol...@lists.mozilla.org; Paul Kehrer 

Subject: RE: Misissuance/non-compliance remediation timelines

Alex,

 

Most CAs probably wouldn’t aim for an A.  I don’t think doing this would be a 
game changer.

 

However there are some CAs that would.  And I think that would be a positive 
thing, and lead to more innovation in best practices that could become 
mandatory for everyone over time.

 

And I don’t disagree with you that action is needed on those who are currently 
getting Ds.  I’m very disturbed by the behavior of about half of the CAs in the 
industry.

 

-Tim

 

From: Alex Gaynor [mailto:agay...@mozilla.com]
Sent: Wednesday, February 7, 2018 8:15 AM
To: Tim Hollebeek 
Cc: Paul Kehrer ; 
mozilla-dev-security-pol...@lists.mozilla.org
Subject: Re: Misissuance/non-compliance remediation timelines

 

Hey Tim,

 

A piece I think I'm missing is what you see as the incentive for CAs to aim for 
an "A" rather than being happy to have a "B". It reminds me of the old joke: 
What do you call the Dr^W CA who graduated with a C average? Dr.^W trusted to 
assert www-wide identity :-)

 

That said, given the issues Paul highlighted in his original mail (which I 
wholeheartedly concur with), it seems the place to focus is the folks who are 
getting Ds right now. Therefore I think the essential part of your email is 
your agreement that CAs which are persistently low performing need to be 
recognized and potentially penalized for the sum total of their behaviors.

 

Alex

 

On Tue, Feb 6, 2018 at 8:30 PM, Tim Hollebeek via dev-security-policy 
 > wrote:

Paul,

I understand your frustration.  I've read some of the recent threads about "how 
long does it take to update a CPS?" and clearly there needs to be some stronger 
compliance language in either the BRs or Mozilla policy ("CAs MUST be able to 
update their CPS within 90 days").  And as you note such policies need to have 
teeth otherwise there will be some who will just ignore them.

However  negative penalties are not the only thing that should be considered.
Mozilla should also have some way of recognizing CAs that are performing above 
and beyond the minimum requirements.  I would love to see Mozilla encourage CAs 
to compete to be the best CA in Mozilla's program.

To satisfy both goals, I'd like to suggest an idea I've had for a while: at 
some point in time (annually?), Mozilla should assess their opinion of how well 
each CA in the program is performing, and give them a letter grade.  This could 
include policy improvements like "Two consecutive failing grades, or three 
consecutive C or lower grades and you're out of the Mozilla program."

This would not preclude other actions as Mozilla deems necessary.  But it would 
provide a regular checkpoint for CAs to understand either "Hey, you're great, 
keep up the good work!" or "Meh, we think you're ok." or "Your performance to 
date is unacceptable.  Get your sh*t together or you're gone."

-Tim


> -Original Message-
> From: dev-security-policy [mailto:dev-security-policy- 
> 
> bounces+tim.hollebeek=digicert@lists.mozilla.org 
> bounces+ ] On Behalf Of Paul
> Kehrer via dev-security-policy
> Sent: Tuesday, February 6, 2018 6:03 PM
> To: mozilla-dev-security-pol...@lists.mozilla.org 
> 
> Subject: Misissuance/non-compliance remediation timelines
>
> A bit over 5 months ago I reported a series of OCSP responders that 
> were violating BRs (section 4.9.10) by returning GOOD on unknown 
> serial
numbers.
> Since that time the majority of those responder endpoints have been 
> fixed,
but
> several are still non-compliant; either with little communication or
continuing
> assurances that it will be fixed "soon", where soon is a date that
continuously
> slides into the future.
>
> At the moment Mozilla possesses very few options when it comes to 
> punitive action and the lesson some CAs appear to be learning is that 
> as long as
you
> don't rise to PROCERT levels of malfeasance/incompetence then the 
> maximum penalty is censure on bugzilla and email threads. Clearly this 
> is not a
deterrent.
>
> So, how long is too long? At what point should a CA incur consequences
(and
> what form can those consequences take) for failure to remediate 
> despite
being
> given such immense latitude?
>
> 

Re: ccadb.org

2018-01-29 Thread James Burton via dev-security-policy
 Hi Jonathan,

I haven't got the required permission to access bug 1376996.

Thank you,

James


On Tue, Jan 30, 2018 at 12:57 AM, Jonathan Rudenberg <jonat...@titanous.com>
wrote:

>
> > On Jan 29, 2018, at 19:48, James Burton via dev-security-policy <
> dev-security-policy@lists.mozilla.org> wrote:
> >
> > I was doing research on the ccadb.org site and was surprised to find
> that
> > the site is running only in HTTP and is not using HTTPS.
>
> There is already a bug about this: https://bugzilla.mozilla.org/
> show_bug.cgi?id=1376996
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


ccadb.org

2018-01-29 Thread James Burton via dev-security-policy
I was doing research on the ccadb.org site and was surprised to find that
the site is served only over HTTP and is not using HTTPS. Now, I understand
that GitHub Pages doesn't support HTTPS for custom domains, but you could
always use CloudFlare for HTTPS support in the meantime until GitHub
enables HTTPS for custom domains.

James
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: Summary of Responses to the November CA Communication

2018-01-26 Thread James Burton via dev-security-policy
You really should set up an emergency conference call with all members of
the CAB Forum and talk about these issues with the Chair. If you and other
members feel that the answers are not satisfactory, then you can vote
to remove the Chair for dereliction of duty and place the Vice Chair in
charge of the Forum until the end of the current term, or until you appoint
a new Chair.

James

On Fri, Jan 26, 2018 at 1:58 PM, Ryan Sleevi via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> On Fri, Jan 26, 2018 at 5:43 AM Gervase Markham  wrote:
>
> > On 24/01/18 13:56, Ryan Sleevi wrote:
> > >> more frequently when requirements change. I propose that we require
> > >> CAs to update their CPS to comply with version 2.5 of the Mozilla root
> > >> store policy no later than 15-April 2018.
> >
> > I think Ryan is right here; the deadline for complying with most of the
> > new changes was "immediately" - in part, that was due to the nature of
> > the changes, in that this was possible, and also we put out a call for
> > "does anyone need an implementation period for any of these things", and
> > the only response was from Globalsign, which led to the modification of
> > the email intermediate compliance dates.
> >
> > I realise that updating one's CPS to match changes in practice can't be
> > done overnight - there are change control procedures - but taking 15
> > months is ridiculous. We should get back to Microsec and tell them that
> > this is unacceptable. If we do set a "new" deadline for CPS updates, it
> > should be closer than mid-April, and we should update our policy to make
> > it clear how fast we expect CPSes to be updated in the wake of
> > "immediate" new requirements - either from a new version of the policy,
> > or from some emergency action we take.
> >
> > > 2 should be inconsequential, but 1 has a very real effect -
> > > unless/until the CA updates their CP/CPS to explicitly state what
> > > methods they are using (implicitly disavowing the 'any other method'),
> > > then a CA can receive a fully compliant audit, despite actively issuing
> > > certificates using 'any other method', in contravention of Mozilla
> > > Policy.
> >
> > Ryan: I thought you had previously made the case that all CAs actually
> > had to abide by the latest version of the BRs? If that is so, then
> > surely your point above is incorrect?
>
>
> The CA/Browser Forum chair - an employee of Entrust - has been
> irresponsibly derelict in his duties as Chair. He has failed to exercise
> the Bylaws as required of him, has failed to inform the Forum about the IP
> notices received (if any) or to update the Public Mail List and Public Web
> Site, and as a result, created a situation where it is defensibly ambiguous
> as to whether only the “10 Blessed Methods” are used.
>
> It is unclear whether this is due to incompetence or malice, but the
> consequence is such that a CA, such as Entrust, could attempt to argue that
> the CA/Browser Forum did not declare a version of the BRs post-Ballot 190
> as Final, and thus avoid triggering that clause.
>
> The above remarks I highlighted would have allowed for a defense in depth
> from the CA/Browser Forum chair failing or abusing their position, by
> ensuring clear and unambiguous statements by CAs for security critical
> matters.
> ___
> dev-security-policy mailing list
> dev-security-policy@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-security-policy
>
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Retirement of RSA-2048

2018-01-20 Thread James Burton via dev-security-policy
Approximate date of retirement of RSA-2048?
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: On the value of EV

2017-12-11 Thread James Burton via dev-security-policy
EV is on borrowed time, and deprecating EV is the most logical and viable
solution right now; it brings us one step closer to retiring the old, broken
web security frameworks of the past. Now that both Ian and I have
demonstrated the fundamental issues with EV and the way it's displayed
in the UI, it is only a matter of time until the real phishing with EV starts.

James

On Mon, Dec 11, 2017 at 8:29 PM, Matthew Hardeman via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> The question that I have is whether the community might consider it
> in-scope to discuss enhancements (even fixes) to EV to arrive at assurance
> commensurate to its handling.
>
> Matt Hardeman
>
> On Mon, Dec 11, 2017 at 2:09 PM, Ryan Sleevi via dev-security-policy <
> dev-security-policy@lists.mozilla.org> wrote:
>
> > On Mon, Dec 11, 2017 at 2:50 PM, Tim Hollebeek <
> > tim.holleb...@digicert.com> wrote:
> >
> > > Certainly, as you noted, one option is to improve EV beyond simply
> > > being an assertion of legal existence.
> > >
> >
> > Does this mean we're in agreement that EV doesn't provide value to
> > justify the UI then? ;-)
> >
> > I say it loaded and facetiously, but I think we'd need to be honest and
> > open that if we're saying something needs to be 'more' than EV, in order
> > to be useful and meaningful to users - which is what justifies the UI
> > surface, versus being useful to others, as Matt highlighted - then either
> > EV meets the bar of UI utility or it doesn't. And if it doesn't, then
> > orthogonal to and separate from efforts to add "Validation ++" (whether
> > they be QWACS in eIDAS terms or something else), then there's no value in
> > the UI surface today, and whether there's any value in UI surface in that
> > Validation++ should be evaluated on the merits of Validation++'s proposals,
> > and not by invoking EV or grandfathering it in.
> > ___
> > dev-security-policy mailing list
> > dev-security-policy@lists.mozilla.org
> > https://lists.mozilla.org/listinfo/dev-security-policy
> >
> ___
> dev-security-policy mailing list
> dev-security-policy@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-security-policy
>
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: DigiCert-Symantec Announcement

2017-09-20 Thread James Burton via dev-security-policy
Hi Jeremy,

Is DigiCert planning to continue selling DV certificates after the
transition? DigiCert has previously been vocal about the drawbacks of issuing
DV certificates outweighing the benefits, as stated here:
https://www.digicert.com/dv-ssl-certificate.htm. If DigiCert is going to issue
DV certificates, which root or roots are you going to dedicate to those
certificates?

James
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: StartCom inclusion request: next steps

2017-09-18 Thread James Burton via dev-security-policy
On Monday, September 18, 2017 at 11:38:57 AM UTC+1, Inigo Barreira wrote:
> > 
> > I want to give you some words from one of the "community side" (this is a
> > personal opinion and may vary from other opinions inside the community).
> > 
> > Trust is not something that you get, it is something that you earn.
> 
> True
> 
> > StartCom was distrusted because of serious issues with their old PKI and now
> > had the chance to restart - there are serious issues again. I don't think 
> > that
> > the "community" wants rogue CAs on its list just because they restarted with
> > new certificates.
> > 
> > - The fact that you were cross-signed by Certnomis before you had valid
> > WebTrust Audits and the permission to issue trusted certificates again and
> > that the only thing which prevented you from using the trust path is a 
> > PUBLIC
> > certificate? Is the only thing that prevents me from entering your 
> > datacenter
> > a sign which tells me not to do so and the fact that you did not tell me 
> > where
> > your datacenter is located?
> > 
> > - StartCom operates/operated multiple CT log servers itself. There is
> > absolutely no reason to use trusted certificates for testing purposes if it
> > has a testing infrastructure. It would be easy for you to add one of your
> > testing roots to your CT logs and then test your CT behaviour. I don't think
> > that Google's CT logs are different from your own ones. Though your
> > certificates might not have been trusted at that time, they would be now,
> > and as Gerv said, test certificates are not allowed. If you did not care
> > about compliance at that time, why should you care about it now?
> 
> Those certificates were not trusted at that time and can't be now because
> they were revoked within minutes.
> 
> > 
> > - There is a reason why best practices are called best practices. Why did
> > you reuse your key in root and intermediate certificates? Because there is
> > no money for additional HSMs? Because you don't know how to generate new
> > keys? An explanation would be great.
> 
> A new thread has started about this. It's not forbidden.
> 
> > 
> > - P-521 is forbidden by Mozilla. Even if there is a discussion about
> > changing this, it does not allow you to take that as permission to test it.
> > The fact that these certificates were reported as unrevoked at the time of
> > reporting (as far as I remember) does imply that you do not monitor your
> > certificate issuances for policy compliance at all. What do you do to
> > ensure that all of your certificates are compliant with all requirements
> > at all times?
> 
> At the time of application, the certificates were revoked and countermeasures
> put in place, and since then there have been no more issues. We have
> integrated cablint, crt.sh, ... and some other tools into our issuance
> process and are still trying to improve much more.
> I'm not trying to excuse the issues we had, but we corrected them.
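
As a quick aside for readers following the P-521 point: the curve of a
certificate's key is easy to check locally. A minimal sketch with the Python
cryptography library, where the PEM path is a placeholder and P-256/P-384 are
taken as the curves Mozilla policy allows:

from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import ec

ALLOWED_CURVES = {"secp256r1", "secp384r1"}   # P-256 and P-384

def check_curve(pem_path: str) -> str:
    with open(pem_path, "rb") as f:
        cert = x509.load_pem_x509_certificate(f.read())
    key = cert.public_key()
    if not isinstance(key, ec.EllipticCurvePublicKey):
        return "not an EC certificate"
    name = key.curve.name                     # e.g. "secp521r1" for P-521
    verdict = "allowed" if name in ALLOWED_CURVES else "NOT allowed by policy"
    return cert.subject.rfc4514_string() + ": " + name + " (" + verdict + ")"

print(check_curve("cert.pem"))                # placeholder path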
> 
> > 
> > - What internal audits have you done to ensure the integrity of your
> > systems? If something as critical as the permission to issue certificates
> > in EJBCA is only noticed after you explicitly look for it, what happens if
> > someone removes all of your security mechanisms? Will you only find that
> > out after you have misissued thousands of certificates? Quis custodiet
> > ipsos custodes.
> 
> Despite all the terrible systems we have, etc., we haven't misissued
> thousands of certificates, nor hundreds. The issues we had have been fixed.
> Those test certs issued directly from EJBCA were a mistake, explained many
> times. I have nothing to add to what I've already said. It was not a good
> decision, not a good practice, and it's forbidden.
> 
> > 
> > - The incidents with DigiNotar should have made clear that secure, well
> > audited and hardened code is absolutely necessary, as well as reliable logs.
> > The fact that these flaws were not found by your internal team and only
> > discovered after an external company tested your systems is deeply
> > concerning. What have you done now and what will you do to ensure that
> > your systems won't be abused? How do you make sure that the code your
> > people write in the future is safe, and how do you detect security problems
> > if you were unable to do so the first time?
> 
> This is a different example: DigiNotar was attacked and the attacker was able
> to get into their systems, and that is not what happened with StartCom. As
> said, the code that went live is not the same code that was audited the first
> time and has been improved since then. The audits are there for exactly that,
> and we will continue doing yearly security audits to improve our systems.

Why not open-source the code on GitHub and let us be the judge of the
improvements made to your systems' code? Let's Encrypt does this and it works
successfully.

> > 
> > Though I would love to see StartCom up and running again, I have to agree
> > with James that 

Re: FW: StartCom inclusion request: next steps

2017-09-15 Thread James Burton via dev-security-policy
On Friday, September 15, 2017 at 12:30:00 PM UTC+1, James Burton wrote:
> On Friday, September 15, 2017 at 10:56:11 AM UTC+1, Inigo Barreira wrote:
> > > 
> > > > Those tests were done to check the CT behaviour, there was any other
> > > testing of the new systems, just for the CT. Those certs were under 
> > > control all
> > > the time and were lived for some minutes because were revoked inmediately
> > > after checking the certs were logged correctly in the CTs. It´s not a mis-
> > > issuance by means of we didn´t know what happened, we had to investigate,
> > > etc. It was not a good practice and I can´t excuse for that, but it was 
> > > not
> > > related to the regular issuance procedure as someone suggested. We
> > > provided a report in which indicated all that happened and what we did to
> > > not happen this again, updating the EJBCA roles permissions.
> > > 
> > > 1) Why didn't StartCom build a test hierarchy?
> > 
> > Considering that we were distrusted, that we didn´t reapply for inclussion, 
> > that CT is only required by Chrome and it´s not included in the Mozilla 
> > policy (even we were requested that all of our certs had to be CT logged) 
> > nor required by Firefox, that those certs were under our control all the 
> > time and lived for some minutes because were revoked inmediately, at that 
> > time, when we did it, we didn´t expect this reaction for sure.
> > 
> > Of course if we had known it we hadn´t done it and for sure had built a 
> > test hierarchy but there´s nothing we can do now. Only wanted to state that 
> > those certs were under our control all the time, and lived for some minutes 
> > because were revoked after the test. There were not any other testing of 
> > any other nature directly in the production system
> > 
> > > 2) Why didn't StartCom use the TestTube CT log for testing CT?
> > 
> > We tried to check and test the same behaviour before going live with the CT 
> > logging, so followed the requirements to use 3 logs, one google and one 
> > non-google, for the EVs and this is what we did. We used the same settings 
> > we had before the distrust using the startcom log and the google ones 
> > (pilot and rocketeer).
> 
> Hi Inigo,
> 
> I'm just trying to get everything straight, cleared up and then we can move 
> on.
> 
> I found all of your answers very concerning in the way you've conducted
> testing. I thought that all CAs must have some type of test hierarchy in
> place to test new software, requirements, etc. before going live, but from
> the answers you've given, StartCom neglected this in favour of testing as
> you go along and dealing with problems on a live CA hierarchy. All this
> feels very amateurish and doesn't give me any confidence at all that your
> CA has proven itself to be a trustworthy part of this infrastructure.
> 
> I recommend to Mozilla to require StartCom to start again fresh. 
> 
> 1. Build a test hierarchy. Test the software and etc to industry guidelines.
> 2. Build a new production hierarchy (including new HSMs, keys, roots, etc.) 
> and then re-apply for inclusion into the Mozilla root program.
> 3. Once approved then you can cross-sign your roots with another CA.
> 
> While this is happening, you can resell certificates from other CAs and build
> up trust in the industry, which I feel will benefit you in the long term.
> 
> James

Hi Inigo,

To add from the last post.

I know this is unwelcome news to you, but I feel that with all these incidents
happening right now with Symantec, and the incidents before, we can't really
take any more chances. Every incident erodes trust in this system, and if we
want more people to adopt HTTPS in the long term, I feel that we need to start
operating this infrastructure above board and without issues.

James
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: FW: StartCom inclusion request: next steps

2017-09-15 Thread James Burton via dev-security-policy
On Friday, September 15, 2017 at 10:56:11 AM UTC+1, Inigo Barreira wrote:
> > 
> > > Those tests were done to check the CT behaviour, there was any other
> > testing of the new systems, just for the CT. Those certs were under control 
> > all
> > the time and were lived for some minutes because were revoked inmediately
> > after checking the certs were logged correctly in the CTs. It´s not a mis-
> > issuance by means of we didn´t know what happened, we had to investigate,
> > etc. It was not a good practice and I can´t excuse for that, but it was not
> > related to the regular issuance procedure as someone suggested. We
> > provided a report in which indicated all that happened and what we did to
> > not happen this again, updating the EJBCA roles permissions.
> > 
> > 1) Why didn't StartCom build a test hierarchy?
> 
> Considering that we were distrusted, that we didn't reapply for inclusion,
> that CT is only required by Chrome and is not included in the Mozilla
> policy (even though we were asked to CT-log all of our certs) nor required
> by Firefox, and that those certs were under our control all the time and
> lived for some minutes because they were revoked immediately, at the time
> we did it we certainly didn't expect this reaction.
> 
> Of course, if we had known, we wouldn't have done it and would certainly
> have built a test hierarchy, but there's nothing we can do now. I only
> wanted to state that those certs were under our control all the time, and
> lived for some minutes because they were revoked after the test. There was
> no other testing of any other nature directly on the production system.
> 
> > 2) Why didn't StartCom use the TestTube CT log for testing CT?
> 
> We tried to check and test the same behaviour before going live with the CT
> logging, so we followed the requirement to use 3 logs, one Google and one
> non-Google, for the EVs, and this is what we did. We used the same settings
> we had before the distrust, using the StartCom log and the Google ones
> (Pilot and Rocketeer).

Hi Inigo,

I'm just trying to get everything straight, cleared up and then we can move on.

I found all of your answers very concerning in the way you've conducted
testing. I thought that all CAs must have some type of test hierarchy in place
to test new software, requirements, etc. before going live, but from the
answers you've given, StartCom neglected this in favour of testing as you go
along and dealing with problems on a live CA hierarchy. All this feels very
amateurish and doesn't give me any confidence at all that your CA has proven
itself to be a trustworthy part of this infrastructure.

I recommend to Mozilla to require StartCom to start again fresh. 

1. Build a test hierarchy. Test the software and etc to industry guidelines.
2. Build a new production hierarchy (including new HSMs, keys, roots, etc.) and 
then re-apply for inclusion into the Mozilla root program.
3. Once approved then you can cross-sign your roots with another CA.

While this is happening, you can resell certificates from other CAs and build
up trust in the industry, which I feel will benefit you in the long term.

James
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: FW: StartCom inclusion request: next steps

2017-09-15 Thread James Burton via dev-security-policy
> Those tests were done to check the CT behaviour; there was no other testing
> of the new systems, just for the CT. Those certs were under control all the
> time and lived for some minutes because they were revoked immediately after
> checking that the certs were logged correctly in the CT logs. It's not a
> mis-issuance in the sense that we didn't know what happened and had to
> investigate, etc. It was not a good practice and I can't make excuses for
> that, but it was not related to the regular issuance procedure as someone
> suggested. We provided a report which indicated all that happened and what
> we did so that it does not happen again, updating the EJBCA role permissions.

1) Why didn't StartCom build a test hierarchy? 
2) Why didn't StartCom use the TestTube CT log for testing CT? 
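
As an aside on what CT testing can look like without touching production
roots: an RFC 6962 log exposes an add-chain endpoint, so a chain issued from a
test hierarchy can be submitted to a test log and the returned SCT inspected.
A rough sketch, where the log base URL and file paths are placeholders and the
chain must lead to a root the log accepts:

import base64
import json
import requests
from cryptography import x509
from cryptography.hazmat.primitives.serialization import Encoding

LOG_URL = "https://ct-test-log.example.net"   # placeholder test log base URL

def submit_chain(pem_paths):
    # RFC 6962 add-chain takes base64-encoded DER certificates,
    # leaf first, then intermediates up towards the root.
    chain = []
    for path in pem_paths:
        with open(path, "rb") as f:
            cert = x509.load_pem_x509_certificate(f.read())
        chain.append(base64.b64encode(cert.public_bytes(Encoding.DER)).decode())
    resp = requests.post(LOG_URL + "/ct/v1/add-chain",
                         json={"chain": chain}, timeout=30)
    resp.raise_for_status()
    return resp.json()   # sct_version, id, timestamp, extensions, signature

if __name__ == "__main__":
    sct = submit_chain(["test-leaf.pem", "test-intermediate.pem"])  # placeholders
    print(json.dumps(sct, indent=2))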
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


CloudFlare Issuing SHA-1 SSL Certificates

2017-04-15 Thread James Burton via dev-security-policy
CloudFlare has been issuing SHA-1 SSL certificates from the CloudFlare Inc
Compatibility CA-3, which is a BR violation. See:
https://crt.sh/?CN=%25=34007

Thank you

James Burton
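
For anyone wanting to verify a report like this against an individual
certificate, a minimal sketch with the Python cryptography library; the PEM
path is a placeholder, and the PEM itself can be downloaded from crt.sh:

from cryptography import x509
from cryptography.hazmat.primitives import hashes

def is_sha1_signed(pem_path: str) -> bool:
    with open(pem_path, "rb") as f:
        cert = x509.load_pem_x509_certificate(f.read())
    # signature_hash_algorithm is the digest used for the certificate's
    # signature, e.g. SHA1 for sha1WithRSAEncryption.
    return isinstance(cert.signature_hash_algorithm, hashes.SHA1)

print(is_sha1_signed("suspect.pem"))          # placeholder path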
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy