Simon Josefsson wrote:
>>   It is widely deployed today with TTLS.  I think that allowing this
>> practice to continue is a requirement.
> 
> I agree, but that does not necessarily mean that
> passwords-sent-over-the-wire and passwords-sent-hashed must have the
> same internationalization treatment or considerations.

  That isn't a requirement, but it does make things easier.

> Sure, but see section 4 of RFC 5198.  If it happens that NFC backwards
> compatibility is broken, you end up with the interop problem.

  ? Section 4 says explicitly:

   if a string does not contain any unassigned
   characters, and it is normalized according to NFC, it will always be
   normalized according to all future versions of the Unicode Standard.

  So there is no backwards compatibility problem.
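  To make the RFC 5198 condition concrete: the stability guarantee only
holds for strings with no unassigned code points, so an implementation can
check for those up front.  A minimal sketch in Python (an illustration, not
part of any spec; note that Python's unicodedata tables track whichever
Unicode version that Python build ships with):

```python
import unicodedata

def has_unassigned(s: str) -> bool:
    # General category "Cn" means the code point is unassigned in this
    # Python build's version of the Unicode Character Database.
    return any(unicodedata.category(c) == "Cn" for c in s)

# Ordinary text contains only assigned code points.
assert not has_unassigned("password")

# U+0378 is an unassigned code point (Greek and Coptic block) as of
# current Unicode versions; a future version could assign it.
assert has_unassigned("\u0378")
```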

  My interpretation is that every system taking user input needs to
perform this normalization.  Once that's done, string comparison is
essentially memcmp().
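  As a sketch of that normalize-then-compare approach (hypothetical helper
name, using Python's unicodedata; not taken from any draft):

```python
import unicodedata

def normalize_password(s: str) -> bytes:
    # Normalize user input to NFC, then encode.  After this step,
    # equality is a plain byte comparison -- i.e. memcmp().
    return unicodedata.normalize("NFC", s).encode("utf-8")

# The same visible string can arrive as different code point sequences:
composed = "\u00e9"       # U+00E9 LATIN SMALL LETTER E WITH ACUTE
decomposed = "e\u0301"    # "e" + U+0301 COMBINING ACUTE ACCENT

assert composed != decomposed                                  # raw input differs
assert normalize_password(composed) == normalize_password(decomposed)
```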

  If both the user's system and the authenticator are using the same
version of Unicode, then they interoperate.

  If one is using a newer version, then it either

        (a) creates the same output string, because the normalization of
            characters assigned in the *old* standard is unchanged in
            the *new* one, or

        (b) creates a different output string from the old system,
            because either the standard is not backwards compatible,
            or the implementation is wrong.

  Is there anything I'm missing?

  Alan DeKok.
_______________________________________________
Emu mailing list
[email protected]
https://www.ietf.org/mailman/listinfo/emu
