Re: Improving X.509 certificate validation errors

2020-03-25 Thread Benjamin Kaduk
Hi Martin,

Hopefully this response is still useful a few weeks later.

On Thu, Mar 05, 2020 at 04:10:10PM +0100, Martin Ukrop wrote:
> Hi,
> 
> I’m the lead of a university project investigating (and improving) the
> usability of certificate validation errors. Our goal is to simplify the
> ecosystem by consolidating the errors and their documentation in one place,
> providing replicable example certificates for all validation errors and by
> explaining better what the individual errors mean. The project is live at
> https://x509errors.org/
> 
> Now we are reaching out to library developers and users (you) to ask for
> feedback.
> 
> Currently, we base the system on OpenSSL errors (as it’s the most common).
> We have example certificates for 30+ OpenSSL errors and an in-progress
> mapping of the corresponding errors for OpenSSL, GnuTLS, Botan and MbedTLS.
> In the future, we plan to make it possible to reorganize the site around
> the other libraries (currently, the site is organized by OpenSSL errors),
> to add error frequencies based on Internet-wide scans and to elaborate on
> the consequences of individual errors.
> Ultimately, we want to propose better (ideally user-tested) errors and
> their documentation. (Just recently, we surveyed 180 developers about
> their error documentation preferences, and the reception was good.)
> 
> As developers/users of TLS libraries, what do you think of the idea?
> * Which part(s) do you find the most/least useful?
> * Is there anything you see missing?
> * What are your thoughts on unifying the error taxonomy? (a very long-term
> goal, if at all possible)

I think it's an interesting idea.  To me, perhaps the most valuable part
would be to accumulate a corpus of certificates/chains that are malformed
or fail to validate due to a wide variety of errors, almost akin to a
fuzzing corpus.  I'd also be curious (though I'm not entirely sure how
large a practical impact it would have) to perform a clustering analysis
across different X.509 implementations and see if different implementations
produce different distributions of errors.  (That is, we might expect each
implementation to have an error for "not valid yet", "expired", "missing
required ASN.1 field", etc.; each implementation will have a different
error string, of course, but if we group all certificates that produce the
same error with the same implementation together, we have a bunch of
different clusters.  Repeating the clustering across all implementations
lets us compare the different distributions, and examine certificates that
end up in a different cluster in different implementations.)
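
Very roughly, the clustering step could be sketched in Python along these
lines (only the 'openssl verify' command line is a real interface here; the
file names, the wrapper layout and the hooks for the other libraries are
hypothetical placeholders):

import subprocess
from collections import defaultdict

def openssl_verify(cert_path, ca_path="ca.pem"):
    """Return OpenSSL's error line for cert_path, or 'ok' if it validates."""
    proc = subprocess.run(
        ["openssl", "verify", "-CAfile", ca_path, cert_path],
        capture_output=True, text=True)
    if proc.returncode == 0:
        return "ok"
    # On failure, 'openssl verify' prints a line along the lines of
    # "error 10 at 0 depth lookup: certificate has expired".
    for line in (proc.stdout + proc.stderr).splitlines():
        if line.startswith("error"):
            return line.strip()
    return "unknown error"

# Wrappers for GnuTLS, Botan, MbedTLS, ... would be added here in the same
# shape: certificate path in, error label out.
IMPLEMENTATIONS = {"openssl": openssl_verify}

def cluster(cert_paths):
    """Map implementation name -> error label -> set of certificate paths."""
    clusters = defaultdict(lambda: defaultdict(set))
    for impl_name, verify in IMPLEMENTATIONS.items():
        for path in cert_paths:
            clusters[impl_name][verify(path)].add(path)
    return clusters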
I also like the idea of getting a sense for which types of errors are "most
common", though it will probably require some care to construct the
sample population for the experiment so that the results have interpretive
value.  Those results might (depending on what they are) be used as input
to "best practice" guides for (e.g.) making a local PKI.
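
Given such clusters, the frequency tally itself is trivial (again just a
sketch, reusing the hypothetical cluster() output from the snippet above);
the hard part is choosing the certificate sample:

from collections import Counter

def error_frequencies(clusters, impl_name="openssl"):
    """Error label -> number of certificates producing it, per implementation."""
    return Counter({err: len(certs)
                    for err, certs in clusters[impl_name].items()})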

> During spring, we would like to start creating pull requests improving the
> documentation and error messages in some of the libraries. Would you
> welcome such contributions?

It's probably best to make sure everyone agrees what is meant by
"improving" before doing much writing work; a github issue would be a fine
place to discuss that topic.  I expect that our current error messages are
suboptimal, though we will have to keep in mind during any attempt to
change them that in some cases there will be more value in keeping things
stable than in improving the messages in isolation -- as an
open-source project, it's hard for us to know with confidence which of our
behaviors people are relying on.

-Ben


Alpha1 progress

2020-03-25 Thread Matt Caswell
Yesterday a number of us had a teleconference to update our task
tracking for the 3.0 release. The current spreadsheet gives us the
following dates for the various alpha/beta releases:

Alpha1: 2020-04-15
Alpha2: 2020-05-04
Alpha3: 2020-06-10
Beta1:  2020-06-12

Compare this to the official timeline at
https://www.openssl.org/policies/releasestrat.html, which says:

Alpha1: 2020-03-31
Alpha2: 2020-04-21
Alpha3: 2020-05-21
Beta1:  2020-06-02

Until quite recently we were tracking fairly close to the target dates,
but the last week or so has seen us drift out a bit. As can be seen from
the above, we're about two weeks behind at the moment. This is primarily
because the key generation work has turned out to be more complicated and
significant than we had anticipated.

So, right now, it looks to me like we won't be releasing alpha1 next
week as originally planned.

Matt