Hi Acee,

> On Feb 4, 2025, at 1:35 PM, Acee Lindem <[email protected]> wrote:
>
> Speaking as document shepherd and LSR Co-chair:
>
> Hi Mahesh,
>
>> On Feb 4, 2025, at 3:48 PM, Mahesh Jethanandani via Datatracker
>> <[email protected]> wrote:
>>
>> Mahesh Jethanandani has entered the following ballot position for
>> draft-ietf-lsr-flex-algo-bw-con-18: Discuss
>>
>> When responding, please keep the subject line intact and reply to all
>> email addresses included in the To and CC lines. (Feel free to cut this
>> introductory paragraph, however.)
>>
>> Please refer to
>> https://www.ietf.org/about/groups/iesg/statements/handling-ballot-positions/
>> for more information about how to handle DISCUSS and COMMENT positions.
>>
>> The document, along with other ballot positions, can be found here:
>> https://datatracker.ietf.org/doc/draft-ietf-lsr-flex-algo-bw-con/
>>
>> ----------------------------------------------------------------------
>> DISCUSS:
>> ----------------------------------------------------------------------
>>
>> Section 9, paragraph 0
>>> Operational consideration defined in [RFC9350] generally apply to the
>>> extensions defined in this document as well. This document defines
>>> metric-type range for user defined metrics. When user defined
>>> metrics are used in an inter-area or inter-level network, all the
>>> domains should assign same meaning to the particular metric-type.
>>
>> The Operational Considerations section in this document refers to the
>> Operational Considerations in [RFC9350], which mentions that operators can
>> configure the FAD, but does not mention how. In other words, is there a
>> YANG model defined to configure this feature? If not, why not?
>
> Ostensibly, we have flex-algorithm augmentations in:
>
> https://datatracker.ietf.org/doc/draft-ietf-lsr-isis-yang-augmentation-v1/
> https://datatracker.ietf.org/doc/draft-ietf-lsr-ospf-yang-augmentation-v1/
>
> Since flex-algo is becoming a significant area of LSR extension, the
> co-authors of the above will discuss splitting flex-algo into separate
> drafts.
>
> What we really need is for the YANG module versioning work to conclude and
> be implemented so that extensions are less onerous. But that is a separate
> discussion.

If these are extensions, can they not be implemented as augmentations of
existing modules (in a separate draft)? Either way, a mention that an
extension (an augmentation, in YANG terms) of an existing module is needed
to manage the feature would suffice.
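To make the ask concrete, something along these lines would be enough. This
is only a rough sketch to show the shape of such an augmentation; the module
name, namespace, and node names below are invented for illustration, and the
actual structure belongs in whatever augmentation draft the WG settles on:

  module example-isis-flex-algo-bw {
    yang-version 1.1;
    namespace "urn:example:isis-flex-algo-bw";
    prefix ex-fa-bw;

    import ietf-routing { prefix rt; }
    import ietf-isis { prefix isis; }

    // Hypothetical augmentation hanging the new knobs off the
    // IS-IS instance defined in RFC 9130 (ietf-isis).
    augment "/rt:routing/rt:control-plane-protocols/"
          + "rt:control-plane-protocol/isis:isis" {
      container flex-algo-bw-constraints {
        description
          "Illustrative configuration for the bandwidth/metric
           extensions of draft-ietf-lsr-flex-algo-bw-con.";
        list user-defined-metric {
          key "metric-type";
          description
            "User-defined metric-types; the meaning assigned to a
             given metric-type has to be consistent across areas
             and levels.";
          leaf metric-type {
            type uint8;
            description
              "Metric-type value from the user-defined range.";
          }
          leaf meaning {
            type string;
            description
              "Operator note recording what this metric-type
               represents.";
          }
        }
      }
    }
  }

The OSPF side would be analogous against ietf-ospf (RFC 9129).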
Thanks.

>> ----------------------------------------------------------------------
>> COMMENT:
>> ----------------------------------------------------------------------
>>
>> The document has six authors, which exceeds the recommended author limit.
>> Has the sponsoring AD agreed that this is appropriate?
>
> I had this discussion with the co-authors (especially given that there were
> four authors from one vendor), and all the co-authors were involved in the
> draft and implementation.
>
> If we are going to start gating IGP extensions on the standardization of
> YANG model extensions, we are going to need to allow many more authors.
>
> Thanks,
> Acee

Mahesh Jethanandani
[email protected]
_______________________________________________
Lsr mailing list -- [email protected]
To unsubscribe send an email to [email protected]
