On 13.02.26 01:33, Joshua wrote:

> I keep hearing this “there are communities that want pure PQ”, but I’ve yet to hear a compelling reason for this that doesn’t involve embedded devices where code size is a constraint (mentioned in passing in the latest draft).
This probably relates to "less computation", but then:

1. Not all embedded devices are constrained. I just want to emphasize
   that /constrained embedded devices/ are an even narrower set than
   embedded devices in general.
2. Are there concrete numbers on how much code size we are talking
   about, particularly:
     * flash memory usage
     * RAM usage

> While I respect the contents of the draft as probably secure, I think we need to acknowledge the duplication and unnecessary risk we are introducing alongside the universally respected hybrid suites. Is there a customer that can provide a compelling reason as to why a hybrid construction degrades the security of their product? Is there any compelling reason at all /against/ hybridization?
I am also desperately seeking answers to those questions.
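For context, the usual argument /for/ hybridization can be sketched in a few lines: the session secret is derived from the concatenation of both components' shared secrets, so an attacker has to break both to learn it. This is only an illustrative combiner, not the construction from any TLS draft; all names here are made up.

```python
# Illustrative hybrid-KEM combiner sketch (NOT the TLS key schedule).
# Assumption: both shared secrets are 32-byte strings, as with X25519
# and ML-KEM-768; the HMAC label is arbitrary.
import hashlib, hmac, os

def hybrid_secret(ss_classical: bytes, ss_pq: bytes) -> bytes:
    # Bind both inputs into a single secret via an HMAC-based extractor.
    return hmac.new(b"hybrid-combiner", ss_classical + ss_pq,
                    hashlib.sha256).digest()

ss_ecdh  = os.urandom(32)   # stand-in for an X25519 shared secret
ss_mlkem = os.urandom(32)   # stand-in for an ML-KEM shared secret
k = hybrid_secret(ss_ecdh, ss_mlkem)

# Even with one component fully revealed (say, ECDH falls to a future
# quantum computer), the combined key stays unpredictable without the
# other input:
assert hybrid_secret(ss_ecdh, os.urandom(32)) != k
```

Under this view, the hybrid secret is at least as hard to recover as the stronger of its two components, which is exactly why "hybrid degrades security" needs a concrete argument to be taken seriously.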

Andrei states:

> Private sector SW vendors need to comply with government rulemaking, at least if they hope to sell products and services to the government. Also, certain private sector organizations tend to adopt government guidelines for their own operations.

> If the TLS WG standardized every government guideline in order to enable private sector vendors, then there would be far too much noise.

I agree, and I shared this concern in the last WGLC [0].
Additionally, I’d like to point out a compelling case against adopting NIST requirements without further scrutiny: Dual-EC-DRBG. Anyone with a pair of eyes could see that the insufficient truncation of the output and the use of fixed constant curve points, rather than a hash-to-curve procedure (or even hashing to derive the points, as is possible with NIST curves), indicated that someone knew the discrete logarithm relating P and Q. It was nonetheless shipped by Microsoft, RSA, Cisco, and other large companies precisely because there was no scrutiny. I find it particularly disheartening to see, once again, a lack of scrutiny towards the selection of secure defaults for worldwide adoption.
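To make the Dual-EC point concrete, here is a toy sketch of the backdoor structure. It runs on secp256k1 purely for convenience of well-known constants; the real Dual_EC_DRBG uses NIST P-256 and truncates 16 bits of each output, both of which are omitted here. Whoever knows e with P = e·Q can lift one output back onto the curve and recover the generator's next internal state.

```python
# Toy demonstration of the Dual-EC-DRBG backdoor structure.
# Illustrative only: run on secp256k1, with no output truncation.

p = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEFFFFFC2F
n = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141
G = (0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798,
     0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8)

def add(P1, P2):
    # Affine point addition on y^2 = x^3 + 7; None is the point at infinity.
    if P1 is None: return P2
    if P2 is None: return P1
    (x1, y1), (x2, y2) = P1, P2
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    if P1 == P2:
        lam = 3 * x1 * x1 * pow(2 * y1, -1, p) % p
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def mul(k, P1):
    # Double-and-add scalar multiplication.
    R = None
    while k:
        if k & 1: R = add(R, P1)
        P1 = add(P1, P1)
        k >>= 1
    return R

# The designer picks secret d, publishes P = G and Q = d*P,
# and keeps the trapdoor e = d^-1 mod n to themselves.
d = 0xC0FFEE
P, Q = G, mul(d, G)
e = pow(d, -1, n)

# One generator step: output r = x(s*Q), next state s' = x(s*P).
s = 123456789
r = mul(s, Q)[0]
s_next = mul(s, P)[0]

# Attacker with the trapdoor: lift r onto the curve (p % 4 == 3, so a
# square root is a single exponentiation), then e*(s*Q) = s*(e*Q) = s*P,
# whose x-coordinate is exactly the next internal state.
y = pow(r * r * r + 7, (p + 1) // 4, p)
recovered = mul(e, (r, y))[0]
assert recovered == s_next   # all future output is now predictable
```

With truncation as specified (only 16 bits dropped), the attacker simply brute-forces the missing bits over all candidate lifts, which is why the chosen truncation amount was itself part of the criticism.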

I raised similar concerns in the last WGLC [0]:

   I believe TLS WG should be carefully analyzing and defining what is
   secure use of PQ in TLS rather than being forced to support
   potentially weak solutions by the "regulatory requirements".

Nothing much seems to have changed since the last WGLC. As Stephen rightly points out, the concerns are almost a "replay" of the last WGLC. What am I missing? Do we just want to play a "please replay your concerns" game for two weeks?

If anything, the facts show that the situation is even worse than at the last WGLC, since it has become evident that there are no concrete "regulatory requirements".

Also, I would add that it would be better if some of the PQ experts in the WG carried out the analysis mentioned above, because I am in the very early stages of working with PQ and I don't see any urgency around ML-KEM.

-Usama

[0] https://mailarchive.ietf.org/arch/msg/tls/Yul6hw0gD-48n4CjCOthafKZ7Rc/


_______________________________________________
TLS mailing list -- [email protected]
To unsubscribe send an email to [email protected]
