> As I recall we didn't define the U/L bit; the IEEE did, and we decided
> to invert it. And I thought the U/L bit was part of the ID allocation
> mechanism more than anything else. It has an unfortunate side effect of
> appearing to add semantics to the identifier field, but I don't think it
> really does so.

Without going into history, the current state is that the U/L bit has
semantics, in the sense that we point out [somewhere] that e.g. future
transport protocols might use the knowledge that the interface ID is
globally unique when the U bit is set.

> Correct, and this is exactly how an identifier field should be
> -- opaque, no syntax, no semantics. That is my whole point.

FWIW we seem to, for whatever reasons, already have embarked on the path
of having semantics associated with the interface ID.

While I'm not advocating removing the U/L bit, it sure would be interesting
to know whether we think assigning globally unique semantics is
a bad idea. If so, perhaps we should make it clear that U=1 means
"assigned by the IEEE according to EUI-64" and nothing more.
My interpretation is that this is not the current state of things.
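For concreteness, a minimal Python sketch (mine, not from this thread) of
the mechanics being discussed: expanding a 48-bit IEEE MAC address into a
Modified EUI-64 interface ID, including the inversion of the IEEE
universal/local bit (the 0x02 bit of the first octet). The function name
is hypothetical; note the inversion means a globally unique MAC ends up
with u=1 in the interface ID.

```python
def mac_to_interface_id(mac: str) -> str:
    """Form a Modified EUI-64 interface ID from a 48-bit MAC address.

    Illustrative only: inserts 0xFF 0xFE in the middle to expand
    EUI-48 to EUI-64, then inverts the universal/local bit so that
    an IEEE-assigned (globally unique) address yields u=1.
    """
    octets = [int(b, 16) for b in mac.split(":")]
    eui64 = octets[:3] + [0xFF, 0xFE] + octets[3:]
    eui64[0] ^= 0x02  # invert the U/L bit of the first octet
    return ":".join(f"{b:02x}" for b in eui64)

print(mac_to_interface_id("00:1a:2b:3c:4d:5e"))
# -> 02:1a:2b:ff:fe:3c:4d:5e
```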

  Erik

--------------------------------------------------------------------
IETF IPng Working Group Mailing List
IPng Home Page:                      http://playground.sun.com/ipng
FTP archive:                      ftp://playground.sun.com/pub/ipng
Direct all administrative requests to [EMAIL PROTECTED]
--------------------------------------------------------------------
