Hi Mateusz,
Impressive work! Kudos!!!

I followed the steps and it compiled and ran the tests (in fact, it is
still running, at 4% as I write this).

The installation steps you wrote are easy to follow and work
"out-of-the-box".

I think it would be important to have some documentation explaining how to
add new test cases, how to add a new board (currently no real board is
supported yet), and what nuttx-ntfc is and what its goal is. Or better yet:
what it can do and what it can't.

Also, the "for Community" in the name gives the impression that someone
created the tool for internal use and then decided to release a separate
version "for Community". Maybe it would be better to define a new meaning
for the C: "for Compatibility", "for Completeness", etc.

While I'm writing this email, I saw these FAILED messages, but no
indication of what caused the failures:

external/nuttx-testing/arch/space/test_space_integration.py::test_df PASSED [  4%]
external/nuttx-testing/arch/timer/test_arch_timer_integration.py::test_timerjitter_integration FAILED [  4%]
external/nuttx-testing/driver/framebuffer/test_driver_framebuffer_integration.py::TestDriverFramebuffer::test_driver_framebuffer_black FAILED [  4%]
external/nuttx-testing/driver/framebuffer/test_driver_framebuffer_integration.py::TestDriverFramebuffer::test_driver_framebuffer_white

Is there some way to improve the test speed? It has been running for more
than 40 min and is still at 4%. Some tests, like the ones that failed, take
more than 7 min to run, and my laptop is not a slow machine (Dell XPS,
i7-11800H), so I think on an older PC it will be even slower. Maybe it is
something in my Ubuntu setup; did you notice anything similar in your
setup? How much time does the full run take to finish on SIM? (I didn't
test QEMU, but I suspect it will be slower.)

I noticed that during the test "python" is using 100% of the CPU
(sometimes it drops to 99.7%).

These are minor issues; I think you have created something really powerful
that will help improve NuttX quality to a level we have never seen before.

BR,

Alan

On Sun, Nov 2, 2025 at 2:51 PM raiden00pl <[email protected]> wrote:

> Hi,
> here are the repositories if anyone would like to give them a try:
>
> - Test runner: https://github.com/szafonimateusz-mi/nuttx-ntfc
> - Test cases: https://github.com/szafonimateusz-mi/nuttx-testing
>
> A quick guide on how to run the tests can be found here:
>
> https://github.com/szafonimateusz-mi/nuttx-vtfc/blob/main/docs/quickstart.rst
>
> The easiest way to get started on a Linux host is by using a simulator or
> qemu-intel64.
> I'll add examples for other QEMU targets later.
>
> There's still a lot of work to be done, but it's initially working and
> presents
> the general idea of the tool. Any feedback or ideas are very welcome :)
>
> On Thu, Oct 23, 2025 at 13:33 Filipe Cavalcanti
> <[email protected]> wrote:
>
> > This seems very good.
> >
> > We also have a lot of tests internally at Espressif, and we would be
> > willing to share them. Many of them simply cover basic use of defconfigs,
> > and some cover more complex functionality such as MCUBoot, flash
> > encryption, the file system, etc.
> >
> > Also, we use the pytest-embedded plugin for all of our tests, which allows
> > us to communicate with serial devices and even QEMU. I have created the
> > pytest-embedded-nuttx plugin, which adds support for NuttX by parsing the
> > NuttShell output (either over serial or QEMU).
> >
> > I think it is important that we have the simulator and QEMU support, since
> > those don't need hardware, but we should have some way of adding outside
> > runners so we can test on actual devices.
> >
> > Looking forward to seeing the Python package and the test cases!
> > ________________________________
> > From: Nathan Hartman <[email protected]>
> > Sent: Wednesday, October 22, 2025 10:55 PM
> > To: [email protected] <[email protected]>
> > Subject: Re: Automated Testing Framework for NuttX
> >
> > Anything we can do to improve the testing coverage of NuttX is a good
> > thing. I am not familiar with pytest etc., but conceptually the
> > description sounds good. It also sounds like the test cases can be
> > extended over time.
> >
> > Creating additional repositories is easy. The community just needs to
> > decide what repositories it needs and they can be created.
> >
> > Agreed we need a cool name for this! I haven't thought of a good one yet
> > but let's solicit ideas from the community!
> >
> > Cheers,
> > Nathan
> >
> > On Wed, Oct 22, 2025 at 8:24 AM raiden00pl <[email protected]> wrote:
> >
> > > Hi everyone,
> > >
> > > Xiaomi would like to contribute to the community an automated testing
> > > tool I recently wrote, along with the dedicated NuttX test cases they
> > > use in their CI.
> > >
> > > The main idea is to provide a universal tool for automated testing that
> > > can be used by the NuttX project and its users. It could be used in
> > > upstream CI as well as in the NXDART project. It’s a similar concept to
> > > `nuttx/tools/ci/testrun`, but offers more advanced functionality.
> > >
> > > The tool is a standalone Python module that runs pytest underneath and
> > > extends its functionality with custom pytest plugins. This way, we can
> > > completely separate the test cases from the logic that executes them,
> > > which is not standard pytest usage. This approach provides more control
> > > over what the tool can do, but integrating with pytest requires some
> > > tricks.
> > >
> > > Test cases are written as regular pytest tests. In short, they execute
> > > commands on a NuttX target and check the returned data. Currently, SIM
> > > and QEMU targets are supported. In the future, I plan to add serial
> > > port support so that real hardware testing can be done. Ideally, the
> > > tool would handle the entire process of building a NuttX image,
> > > flashing it onto hardware, and running tests (like Twister for Zephyr).
> > >
> > > The test cases are based on programs available in nuttx-apps (some
> > > tools aren’t upstream yet). Test cases from Xiaomi that don’t fit well
> > > with NuttX upstream will go into the OpenVela project. Users and
> > > companies interested in NuttX will be able to create their own test
> > > packages and use the same tool to run and manage them. I think this
> > > aligns quite well with the idea of distributed tests for NuttX (NXDART).
> > >
> > > The tool can generate reports and save them locally, including test
> > > results and console logs for easier debugging. In the future, we could
> > > add options to compare various OS metrics between tests (image size,
> > > performance, etc.).
> > >
> > > The tool is already functional and can replace the current CI tool
> > > `nuttx/tools/ci/testrun`. More features will be added over time; I have
> > > a lot of ideas here.
> > >
> > > After this short introduction, I have a few questions for the community:
> > >
> > > 1. Is the community interested in adopting this new testing tool and
> > >    test cases for NuttX? Xiaomi has agreed to donate the code to the
> > >    project.
> > >
> > > 2. Inside Xiaomi, the project is called VTFC (Vela Testing Framework
> > >    for Community). The name VTFC is a bit cryptic and refers
> > >    specifically to Vela, but Xiaomi is open to renaming it. Some
> > >    alternative suggestions are NTFC (NuttX Testing Framework for
> > >    Community) or NTF (NuttX Testing Framework). If anyone has better
> > >    ideas, please let us know. As we all know, a cool project name is
> > >    important :)
> > >
> > > 3. I think we’d need two separate repositories for this:
> > >
> > >    - one for the testing tool that runs the test cases and does the
> > >      rest of the magic (VTFC),
> > >    - and another for test cases dedicated to NuttX upstream.
> > >
> > >    For the second one, we could use
> > >    https://github.com/apache/nuttx-testing, which is currently empty,
> > >    but we’d still need one more repo.
> > >
> > >    Alternatively, we could use the https://github.com/Nuttx
> > >    organization, which would make repository management easier (no
> > >    Apache infra), but then the code would not officially be under
> > >    Apache, and I don't know what consequences that has.
> > >
> > >    Separation of the test tool and test cases from the kernel code is
> > >    important because it allows for better QA for the Python code and
> > >    automation. There will be quite a lot of Python coding involved, and
> > >    mixing it with kernel code doesn’t feel good.
> > >
> > > Let me know what you think. Any suggestions are welcome :)
> > >
> > > Regards
> > >
> >
>
