On Thu, Aug 22, 2019 at 10:29 PM Jeremy Rowley via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> I posted this tonight:
> https://bugzilla.mozilla.org/show_bug.cgi?id=1576013. It's sort of an
> extension of the "some-state" issue, but with the incorporation information
> of an EV cert.  The tl;dr of the bug is that sometimes the information
> isn't perfect because of user entry issues.
> What I was hoping to do is have the system automatically populate the
> jurisdiction information based on the incorporation information. For
> example, if you use the Delaware secretary of state as the source, then the
> system should auto-populate Delaware as the State and US as the
> jurisdiction. And it does...with some.
> However, you do have jurisdictions like Germany that consolidate
> incorporation information to www.handelsregister.de, so you can't actually
> tell which area is the incorporation jurisdiction until you do a search.
> Thus, the fields allow some user input. That user input is what hurts. In
> the end, we're implementing an address check that verifies the
> locality/state/country combination.

Could you explain your proposal here a bit more? My understanding is
that, despite the Handelsregister ("Commercial Register") being available
at a country level, it's further subdivided by county or region - e.g. the
Amtsgericht Herne ("Local Court Herne").

It sounds like you're still preparing to allow for manual/human input, and
simply consistency-checking it. Is there a reason not to use an
allowlist-based approach, in which your Registration Agents may only select
from an approved list of County/Region/Locality managed by your Compliance
team?

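To illustrate what I mean, an allowlist-based check might look roughly like
the sketch below. The jurisdiction entries are invented examples, not anyone's
actual compliance data:

```python
# Sketch of an allowlist-based jurisdiction check: the validation UI only
# offers combinations drawn from a compliance-managed list, and any stored
# record can be re-verified against the same list.
# The entries below are illustrative examples, not real compliance data.

APPROVED_JURISDICTIONS = {
    # (country, state/region, locality or None)
    ("US", "Delaware", None),
    ("US", "California", None),
    ("DE", "Nordrhein-Westfalen", "Herne"),
}

def is_approved(country, state, locality=None):
    """Return True iff the country/state/locality combination is allowlisted."""
    return (country, state, locality) in APPROVED_JURISDICTIONS

print(is_approved("US", "Delaware"))                       # True
print(is_approved("DE", "Bavaria"))                        # False: not on the list
print(is_approved("DE", "Nordrhein-Westfalen", "Herne"))   # True
```

The point being that a Registration Agent never types a free-form value; at
worst they pick the wrong entry from a vetted list.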
That, of course, still allows for human error. Using the excellent example
of the Handelsregister, perhaps you could describe a bit more the flow a
Validation Specialist would go through. Are they expected to examine a
faxed hardcopy? Or do they go to handelsregister.de and look up via the
registration code?

I ask, because it strikes me that this could be an example where a CA could
further improve automation. For example, it's not difficult to imagine that
a locally-developed extension could know the webpages used for validation
of the information, and extract the salient info, when that information is
not easily encoded in a URL. For those not familiar, Handelsregister
encodes the parameters via form POST, a fairly common approach for these
company registers, and thus makes it difficult to store a canonical
resource URL for, say, a server-to-server retrieval. This would help you
quickly and systematically identify the relevant jurisdiction and court,
and in a way that doesn't involve human error.
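As a rough sketch of what such a tool might do: replay the register's
search-form POST and parse the registering court out of the response. To be
clear, the field names, and HTML structure below are entirely invented for
illustration; a real tool would mirror whatever the register's search form
actually submits:

```python
# Hypothetical sketch: build a register search-form POST body and extract
# the registering court from the response. Field names and HTML structure
# are invented for illustration, not Handelsregister's actual interface.
from html.parser import HTMLParser
from urllib.parse import urlencode

def build_post_body(register_number):
    """Encode the (hypothetical) search-form fields as a POST body."""
    return urlencode({"registerNumber": register_number,
                      "searchType": "exact"}).encode()

class CourtExtractor(HTMLParser):
    """Collect the text of the first element tagged class="court"."""
    def __init__(self):
        super().__init__()
        self.in_court = False
        self.court = None

    def handle_starttag(self, tag, attrs):
        if ("class", "court") in attrs:
            self.in_court = True

    def handle_data(self, data):
        if self.in_court and self.court is None:
            self.court = data.strip()
            self.in_court = False

def extract_court(html):
    parser = CourtExtractor()
    parser.feed(html)
    return parser.court

# Offline demonstration with a canned response:
sample = '<html><body><span class="court">Amtsgericht Herne</span></body></html>'
print(extract_court(sample))  # Amtsgericht Herne
```

The network fetch is stubbed out here; the interesting part is that the court
identification becomes a deterministic parse rather than a manual transcription.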

I'm curious how well that approach generalizes, and what challenges may
exist. I totally understand that for registries which solely use hard
copies, this is a far more difficult task than it needs to be, and thus
requires an element of human review. However, depending on how prevalent
hardcopy vs. online records are, we might be able to pursue automation for
more registries, and thus increase the stringency for the exceptions that do
involve physical copies.

> The more interesting part (in my opinion) is how to find and address these
> certs. Right now, every time we have an issue or whenever a guideline
> changes we write a lot of code, pull a lot of certs, and spend a lot of
> time reviewing. Instead of doing this every time, we're going to develop a
> tool that will run automatically every time we change a validation rule to
> find everything else that will fail the new update rules. In essence, we're
> building unit tests on the data. What I like about this approach is that it
> ends up building a system that lets us see how all the rule changes
> interplay, since sometimes they may interact in weird ways. It'll also let
> us more easily measure the impact of changes on the system. Anyway, I like
> the idea. Thought
> I'd share it here to get feedback and suggestions for improvement. Still in
> spec phase, but I can share more info as it gets developed.

This sounds like a great idea, and I'd love to know more details here.
For example, what's the process now for identifying these
jurisdictionOfIncorporation issues? How would it improve or change with
this system?
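For the sake of discussion, I imagine the "unit tests on the data" idea
looking something like the following. The rule logic and record fields here
are invented for illustration, not your actual schema:

```python
# Sketch of running validation rules as "unit tests on the data": every
# rule is applied to every stored validation record, so adding or changing
# a rule immediately surfaces the records that would now fail.
# Rules and record fields are invented for illustration.

def state_matches_source(record):
    """Jurisdiction state must match the state implied by the data source."""
    return record.get("source_state") is None or \
           record["state"] == record["source_state"]

def country_present(record):
    """Every record must carry a non-empty country."""
    return bool(record.get("country"))

RULES = {
    "state-matches-source": state_matches_source,
    "country-present": country_present,
}

def audit(records):
    """Return {rule_name: [ids of records that fail that rule]}."""
    failures = {name: [] for name in RULES}
    for record in records:
        for name, rule in RULES.items():
            if not rule(record):
                failures[name].append(record["id"])
    return failures

records = [
    {"id": 1, "state": "Delaware", "source_state": "Delaware", "country": "US"},
    {"id": 2, "state": "Kansas", "source_state": "Delaware", "country": "US"},
    {"id": 3, "state": "Herne", "source_state": None, "country": ""},
]
print(audit(records))  # {'state-matches-source': [2], 'country-present': [3]}
```

Re-running the full audit on every rule change is what would give you the
"interplay" visibility you describe, since each record is checked against the
whole rule set at once.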

You describe it as "validation rule" changes - and I'm not sure if you're
talking about the BRs (i.e. "we validated this org at time X") or something
else. I'm not sure whether you're adding additional data, or formalizing
checks on existing data. More details here could definitely help generalize
it, and perhaps formalize it as a best practice.
Alternatively, even if we can't formalize it as a requirement, it may be
usable as the basis for evaluating the potential impact or cost of changes
(to policy or the BRs) in the future. That is, "any CA that has implemented
(system you describe) should be able to provide quantifiable data about the
impact of (proposed change X). If CAs cannot do so (because they did not
implement the change), their feedback and concerns will not be given the
same weight."