Hello everyone! This evening I have some very good news: as the title says, I finally managed to get the unit test that checks signature verification for the ruleset updater to pass! This is great news, as it means the last major hurdle in my Google Summer of Code project has finally been overcome.
The following is a detailed explanation of the work I've been doing over the last two weeks, the problems I've been running up against, and the solution I finally found to my last major problem. Apologies if it's too verbose; I'll include a TL;DR version at the end.

Those who've followed my reports know that this issue has been *very* problematic, and that, a while ago, Yan was able to produce a key and signature that passed the unit tests using the NSS tools and pk1sign. After she did this, I finished implementing the code to integrate the ruleset updater into the rest of the extension (some testing and development is still under way), which you can see at [1]. However, when I got around to testing and debugging the ruleset updater with some new, more appropriate test data (with version numbers updated to reflect the change to version 5.0 of the extension) and a signature for that data, the signature verification failed, even though I had followed Yan's method exactly. I tried every combination of inputs to the verifier that seemed plausible, depending on where some hashing might be taking place, to no avail.

Just tonight I finally did something that seems to have worked! As you can see in the most recent commits to the feature/tests branch [2], signature verification passes when the raw update.json file contents, and the signature of those contents, are passed to verifyData. I have to admit I am very surprised and confused that verification passes under these circumstances, but not when sha256(update.json) and signature(sha256(update.json)) are passed to the verifier (to use the notation loosely). Specifically, I had been doing the following:

1. Hash update.json's contents with `openssl sha -sha256 -hex update.json`
2. Store the value produced (without the prefixed SHA256(...)=) in update.digest
3. Using certutil and pk1sign, sign update.digest as described in Yan's gist [3]

Of course, when I ran my unit tests, I hardcoded the same value stored in update.digest into my tests to make absolutely sure that:

1. I had written update.json into the test source code correctly, and
2. update.json's contents were hashing to the same value I had taken the signature of.

The TL;DR version I promised is this: for some reason I've spent days trying to understand, the nsIDataSignatureVerifier component does not successfully verify the signature of the hash of some data, even when that data is hashed correctly and the hash is provided as input to the verifier. However, the verifier passes when the signature is taken over the raw data itself and that raw (unhashed) data is provided as the input, as you can see in my recent commits [2].

This means that, to sign update.json in preparation for a ruleset release, the contents of update.json itself would have to be typed into the airgapped machine at EFF instead of just a hash of the contents. Admittedly, this isn't *that* bad an outcome, as update.json does not contain much data, and arguably it's easier to transcribe the contents of a JSON object than to copy a string of hex.

If anyone would like to repeat the process I've been following for themselves and needs clarification on anything, I'm happy to help.

All the best,
Zack

[1] - https://github.com/redwire/https-everywhere/tree/rulesetUpdating
[2] - https://github.com/redwire/https-everywhere/tree/feature/tests
[3] - https://gist.github.com/diracdeltas/39d48e315d4ce1a67b83
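P.S. For anyone who wants to experiment with the raw-data-vs-hashed-data behavior without setting up the NSS tools, here is a minimal sketch using only openssl (NOT the certutil/pk1sign pipeline above; the file names and key are throwaway assumptions). It illustrates a general property of these signing tools: `openssl dgst -sign` hashes its input internally, so a signature taken over the raw file verifies against the raw file, while verifying against a pre-computed hex digest fails because the digest text would be hashed a second time:

```shell
set -e
cd "$(mktemp -d)"

# Throwaway RSA key pair, purely for demonstration.
openssl genrsa -out key.pem 2048 2>/dev/null
openssl rsa -in key.pem -pubout -out pub.pem 2>/dev/null

# A stand-in for update.json (contents are made up for this sketch).
printf '{"branch": "stable", "version": "5.0"}' > update.json

# Sign the raw file; openssl hashes it internally before signing.
openssl dgst -sha256 -sign key.pem -out update.json.sig update.json

# Verifying against the same raw contents succeeds ("Verified OK").
openssl dgst -sha256 -verify pub.pem -signature update.json.sig update.json

# Verifying against a pre-computed hex digest fails, because the digest
# text itself gets hashed again during verification.
openssl dgst -sha256 -hex update.json | awk '{print $2}' > update.digest
openssl dgst -sha256 -verify pub.pem -signature update.json.sig update.digest \
  || echo "verification of the pre-hashed input fails, as expected"
```

If nsIDataSignatureVerifier behaves the same way internally, that would be consistent with what I observed, though I haven't confirmed this against the NSS sources.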
_______________________________________________
HTTPS-Everywhere mailing list
[email protected]
https://lists.eff.org/mailman/listinfo/https-everywhere
