Hi John,

Thank you for your valuable feedback. That addresses my concern. I am happy to swing back to 'no opinion' for this WGLC.
I will do further work based on your feedback and share it with the WG when I have something of substance.
On 06.05.26 16:46, John Mattsson wrote:
> I have a formal proof for that for TLS. Please clarify what you see wrong in my proof.

No, and I sincerely apologize for the misunderstanding. By formal proof, I meant a /symbolic/ proof (in ProVerif), not a /computational/ (cryptographic) proof. I believe the cases you mention need cryptographic analysis. I am not sure how to model firewalls, though.

Did your proof consider known past and current attacks on deployed systems that rely on malleable signatures or signatures lacking beyond-unforgeability properties, including firewalls that use certificate fingerprints in blocklists?
I think your PR would have been less controversial if it had not included a reference to very weak hybrid constructions. Given your usual emphasis on strong security properties, I was somewhat surprised by that direction. ML-DSA is an excellent algorithm in this context. In addition to being non-malleable, which should be a baseline requirement, NIST also significantly strengthened Dilithium with hedged signing and beyond unforgeability (BUFF) properties.
Thank you for your suggestions.
I agree with you regarding EU requirements. However, my reading was that most of them currently require hybrid authentication. Once adopted, the WG could have shaped the guidance. My general point was that there has been very little technical discussion recently and we have all been going in circles. If it were all written up in an adopted draft, we could all improve it and avoid circular discussions.

> It increasingly feels to me that if we had adopted Stephen's draft [1]
> and focused even a small fraction of the energy we have spent on debates
> on ML-DSA and ML-KEM, we would have been far better.

Stephen’s draft suggested not doing any work on PQC signatures, which would have been risky and is not aligned with the EU PQC Roadmap, which identifies PQC as a high priority for trust anchors in long-lived devices and PKI.
> The broader IETF consensus is captured in [0]

There was IETF consensus to publish [0]. However, I do not think there is either IETF or LAMPS consensus that [0] is on par with RFC 9881 or RFC 9909. [0] inherits the malleability weakness of ECDSA, introduces a new independent malleability weakness, and destroys the beyond-unforgeability (BUFF) properties provided by ML-DSA. Compared to standalone ML-DSA, I believe [0] has a higher likelihood of introducing serious vulnerabilities than of mitigating them. In addition, the legal uncertainty around the use of composite constructions is high. I also recently noted that designing new cryptographic algorithms such as [0], without CFRG vetting, is out of scope for the LAMPS charter. If there were a new WGLC, I would oppose publication.

Note that a main driver for [0], SecP256r1MLKEM768, and SecP384r1MLKEM1024 is the ability to sell FIPS-validated implementations of P-256, P-384, and RSA PKCS#1 v1.5 as “quantum-resistant”, even though only the quantum-vulnerable components are FIPS certified.

Ericsson just posted a long public comment [4] on signature security properties, government recommendations, and different approaches to signature hybridisation.

[4] https://emanjon.github.io/NIST-comments/2026%20SP%20800-230%20IPD.pdf
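For readers less familiar with the ECDSA malleability mentioned above: if (r, s) is a valid signature, then (r, n − s) is also valid, so a third party can produce a second, distinct encoding of the same signature without the private key (which is why fingerprint-based blocklists can be bypassed). A minimal pure-Python sketch, using secp256k1 purely as a convenient example curve (not production code, no constant-time or encoding concerns):

```python
# Illustrative sketch: ECDSA malleability, (r, s) and (r, n - s) both verify.
# Pure-Python affine arithmetic over secp256k1; for demonstration only.
import hashlib
import secrets

# secp256k1 domain parameters (y^2 = x^3 + 7 over F_p)
p = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEFFFFFC2F
n = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141
G = (0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798,
     0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8)

def point_add(P, Q):
    """Affine point addition; None represents the point at infinity."""
    if P is None:
        return Q
    if Q is None:
        return P
    if P[0] == Q[0] and (P[1] + Q[1]) % p == 0:
        return None
    if P == Q:
        lam = (3 * P[0] * P[0]) * pow(2 * P[1], -1, p) % p
    else:
        lam = (Q[1] - P[1]) * pow(Q[0] - P[0], -1, p) % p
    x = (lam * lam - P[0] - Q[0]) % p
    return (x, (lam * (P[0] - x) - P[1]) % p)

def point_mul(k, P):
    """Double-and-add scalar multiplication."""
    R = None
    while k:
        if k & 1:
            R = point_add(R, P)
        P = point_add(P, P)
        k >>= 1
    return R

def sign(d, msg):
    z = int.from_bytes(hashlib.sha256(msg).digest(), "big") % n
    while True:
        k = secrets.randbelow(n - 1) + 1
        r = point_mul(k, G)[0] % n
        if r == 0:
            continue
        s = (z + r * d) * pow(k, -1, n) % n
        if s != 0:
            return r, s

def verify(Q, msg, sig):
    r, s = sig
    if not (0 < r < n and 0 < s < n):
        return False
    z = int.from_bytes(hashlib.sha256(msg).digest(), "big") % n
    w = pow(s, -1, n)
    R = point_add(point_mul(z * w % n, G), point_mul(r * w % n, Q))
    return R is not None and R[0] % n == r

d = secrets.randbelow(n - 1) + 1   # private key
Q = point_mul(d, G)                # public key
msg = b"example message"
r, s = sign(d, msg)
mauled = (r, n - s)                # forged by anyone, without the private key
print(verify(Q, msg, (r, s)), verify(Q, msg, mauled))  # True True
```

The reason the mauled pair verifies: replacing s with n − s negates both u1 = z/s and u2 = r/s, which negates the verification point but leaves its x-coordinate (and hence r) unchanged.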
Thank you for this useful reference. I had a quick look, and it already answers some of my questions. I will explore it in more detail.
Kind regards,
Usama
[0] https://www.ietf.org/archive/id/draft-ietf-lamps-pq-composite-sigs-19.html#section-9.1
[1] https://datatracker.ietf.org/doc/draft-farrell-tls-pqg/
[2] https://mailarchive.ietf.org/arch/msg/tls/vIGryOB0TU_vD81HUUxXQUNdnN0/
_______________________________________________
TLS mailing list -- [email protected]
To unsubscribe send an email to [email protected]
