On 2/12/26 3:19 PM, Dmitry Mityugov via discuss wrote:
> Hi,
> 
> I'm looking into implementing some unit tests for OVS/OVN. By unit
> tests, I mean the foundation of the testing pyramid, that is - a bunch
> of small tests that run against separate functions to guarantee that
> they remain valid after modifications. Please correct me if I'm wrong
> but to my understanding, these tests are not the same as `make check`
> and friends that look more like integration tests to me.

Hi, Dmitry.  'make check' runs a lot of different tests.  A good portion
of them are indeed more like integration tests, as they spin up an
actual ovsdb-server and ovs-vswitchd to test specific functionality.

However, we do have actual pure unit tests there as well.  They come in
two forms:

1. Small C programs built on the 'ovstest' framework.  You can find them
   by searching for uses of the OVSTEST_REGISTER macro.  These mostly
   cover library code with well-defined interfaces.

2. Tests like the ones in tests/ofp-actions.at, which leverage special
   tool commands like 'ovs-ofctl parse-actions ...' to check specific
   functions without writing extra C programs.

There are also somewhat larger C/Python programs that test specific
larger libraries, e.g. ones that require an external connection to a
database server or other things that are not possible with pure unit
testing.
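The second form above lives in the GNU Autotest (.at) files under
tests/.  The following is only an illustrative sketch of that style;
the test title, arguments, and input file name are assumptions, not
copied from tests/ofp-actions.at:

```
dnl Illustrative sketch in the autotest (.at) style; the arguments and
dnl input below are assumptions, not copied from tests/ofp-actions.at.
AT_SETUP([ofp-actions - example unit check via ovs-ofctl])
dnl A real test would first create input.txt with AT_DATA, holding the
dnl action encodings to parse.
AT_CHECK([ovs-ofctl parse-actions OpenFlow10 < input.txt], [0], [stdout])
AT_CLEANUP
```

The idea is that a dedicated command in an existing utility exposes one
internal function, so the .at file can feed it inputs and compare
outputs without any extra C test program.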

All of the above is also true for OVN code.

> In addition,
> it would be nice to gather some coverage statistics from these unit
> tests, to make sure they cover, say, 80 or 90 percent of the code.

We have lcov integration.  See Documentation/topics/testing.rst.
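From memory, the workflow looks roughly like the following; treat
testing.rst as authoritative if these details have drifted:

```
# Sketch from memory of Documentation/topics/testing.rst.
./configure --enable-coverage
make
make check-lcov    # runs the testsuite and generates an lcov report
```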

In general, achieving high coverage with pure unit tests alone would
be a very hard task: there is a lot of platform-dependent code, code
that requires network connections or other external resources, and
functions that are not exported (and should not be exported) and so
are hard to reach from a pure unit test.

Having a lot of unit tests that depend on internal details rather than
an external interface (or at least a widely used internal one) is most
of the time an unnecessary burden during development and maintenance:
they need changes every time the code changes, which largely defeats
the purpose of pure unit testing.

That's why we mostly use pure unit tests for libraries with well-defined
interfaces.

> 
> I'm kindly asking for some suggestions on this subject. Was this
> attempted before? What testing tools might fit well into the OVS/OVN
> build chain?

We have the 'ovstest' framework and the ability to create special test
commands for utilities like ovs-ofctl.  External dependencies are a
liability, so unless they provide tangible benefits, we probably should
not use them.

> Should I stick to C for these tests or C++ might also be
> a good choice?

We use C, Python, some M4 and shell.  The only C++ code we have, IIRC,
is a small program that tests the compatibility of our public headers.
So it's better to stick with C and Python.  For Python, though, the
dependency argument still stands: we should be mindful of what
dependencies the Python code brings in and stick with the standard
library whenever possible.

Best regards, Ilya Maximets.
_______________________________________________
discuss mailing list
[email protected]
https://mail.openvswitch.org/mailman/listinfo/ovs-discuss
