Dear all,

I've been spending a lot of time over the last few days, running my scripts multiple times per day and informing authors about the errors/warnings from my reports <http://claise.be/IETFYANGPageCompilation.html>. Let me hope it made a difference <http://claise.be/IETFYANGPageCompilation.png>!

There are multiple tools out there, and they might produce different results.
That generated some questions/concerns recently: why do I get this error on yangvalidator.org? Why didn't I get any warnings from idnits? How come you have different errors in your reports?

So let me explain the situation.

1. We added pyang to the submission tool during one of the previous IETF hackathons.
Note that pyang doesn't check XPath.
And yes, we might have a caching issue here when people submit multiple interdependent drafts on the same day, typically the last day before the deadline. The reason: when a new draft with a YANG module is posted, the tool validates the new YANG module against the YANG modules it knew from the previous day.
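To make that concrete, here is a minimal sketch (not the actual submission-tool code) of such a pyang check. The "prev_modules" search path is a hypothetical stand-in for the previous day's module cache, which is exactly where the caching issue comes from; the sketch only runs pyang when it is actually installed.

```python
import shutil
import subprocess

def validate_with_pyang(module_path, search_path="prev_modules"):
    """Build and (if pyang is available) run a pyang validation command.

    Returns the command list and, when pyang is installed, its exit code.
    Note: pyang checks syntax and many YANG constraints, but it does not
    evaluate the XPath expressions in must/when statements.
    """
    # -p adds a search path for imported/included modules; here it points
    # at the (hypothetical) cache of modules known from the previous day.
    cmd = ["pyang", "--strict", "-p", search_path, module_path]
    if shutil.which("pyang") is None:
        return cmd, None  # tool not installed; just report the command
    result = subprocess.run(cmd, capture_output=True, text=True)
    return cmd, result.returncode

# Illustrative call; the module name is made up.
cmd, rc = validate_with_pyang("ietf-example@2016-07-01.yang")
print(" ".join(cmd))
```

If the imported modules are not in the search path yet (because the interdependent draft was submitted the same day), the validation fails even though nothing is wrong with the module itself.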

2. yangvalidator.org is a best-effort tool maintained by Carl Moberg.
And yes, we could have a caching issue here too, for the same reason as above.
However, you can load multiple drafts or YANG modules for validation together.
Radek will be adding yanglint to yangvalidator.org during this coming hackathon. Note that yanglint does validate XPath.

3. The right way to do it is:
    - start from scratch,
    - download all the drafts,
    - extract all YANG modules (and correct the extraction issues / warn the authors),
    - include all the RFC YANG modules, and the other YANG modules (IEEE, OpenConfig, etc.) that might be needed,
    - validate all these YANG modules with 4 validators (2 open source, 2 commercial), which you obviously need to keep up to date,
    - report bugs to the different validators,
    - keep the validators up to date,
    - report issues to the authors so they can fix their YANG modules, while keeping the different validator issues in mind (e.g., this warning from validator X is a bug and will be fixed soon, so don't pay attention to it),
    - wash ... rinse ... repeat.
All this (I skipped some steps to shorten the email) is what I do with my own tool chain to generate my reports (http://www.claise.be/2016/07/ietf-yang-modules-statistiques/), for both the IETF drafts and the GitHub repo. When I see how much time that takes, no wonder that tool 2 above might not be fully up to date.
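The validation step of the list above can be sketched as a small loop over extracted modules and validators. Everything here is illustrative: the directory layout, the validator list (only the two open-source tools are shown; the commercial ones are omitted), and the extraction step (done beforehand, e.g. with a tool like xym) are assumptions, and each validator is only invoked when actually installed.

```python
import shutil
import subprocess
from pathlib import Path

# Two open-source validators; the two commercial ones are omitted here.
VALIDATORS = [
    ["pyang", "--strict"],   # does not check XPath
    ["yanglint"],            # does validate XPath
]

def run_if_available(cmd):
    """Run a command if its tool is installed; return exit code or None."""
    if shutil.which(cmd[0]) is None:
        return None
    return subprocess.run(cmd, capture_output=True).returncode

def validate_modules(module_dir):
    """Validate every extracted YANG module with each validator."""
    results = {}
    for module in sorted(Path(module_dir).glob("*.yang")):
        results[module.name] = {
            cmd[0]: run_if_available(cmd + [str(module)])
            for cmd in VALIDATORS
        }
    return results

# Usage: after extracting the modules from the drafts into "modules/",
# produce a per-module, per-validator report:
# report = validate_modules("modules")
```

Comparing the per-validator exit codes (and output) per module is what lets you spot both author errors and validator bugs, which is why the "report bugs to the validators" step is unavoidable.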

I hope this explains the situation, and that you get a good service for what you pay for. :-)

See you at the next hackathon to improve those tools.

Regards, Benoit

_______________________________________________
netmod mailing list
[email protected]
https://www.ietf.org/mailman/listinfo/netmod
