On Tue, 27 Aug 2013 07:41:38 -0300
Antonio Terceiro <antonio.terce...@linaro.org> wrote:

> On Tue, Aug 27, 2013 at 09:37:56AM +0300, Fathi Boudra wrote:
> > On 26 August 2013 21:32, Antonio Terceiro
> > <antonio.terce...@linaro.org> wrote:
> > > On Mon, Aug 26, 2013 at 06:22:22PM +0100, Milosz Wasilewski wrote:
> > >> Hi,
> > >>
> > >> Is it possible in LAVA to force reboot between installing test
> > >> case dependencies and running the test itself? I'm trying to run
> > >> bootchart test, but the bootchart package is not included in the
> > >> ubuntu image I'm running.
> > >
> > > That's not possible. You can workaround this by having two
> > > lava_test_shell actions, one that installs the dependency, and
> > > one that actually runs the tests. It sounds horrible (and it
> > > probably is), but will do the job because the target is rebooted
> > > before each lava_test_shell action.
> > >
> > > IMO the right fix for the problem is having bootchart already
> > > installed in the image though.
> > 
> > it has been considered but I'm not convinced it's the right fix.
> > It's the right workaround ;)

I don't see it as a workaround. Having to bake the test support into
the image beforehand isn't ideal - it may be necessary to add the test
support to an existing image, a previous image or a customer image. By
installing or building the test support inside the image during the
job, the checksum of the image at the start of the test can be used to
reliably indicate a valid test of that image, not of a hand-modified
version of that image.

> > imo, the real problem: a test is considered finished when we reboot
> > the image. 

Just to add to Antonio's comment - the test is considered complete but
the results are not collected until *all* tests are complete for that
job.

Installing the test support is a separate test, and it deserves a
separate result in the final bundle - the result of the installation /
build. So the first test does that to the supplied image; that is one
set of results in the bundle. The target then reboots (because of the
second lava_test_shell block) and runs the second test. The final
results bundle contains all the test results, separating the setup of
the test support from the operation of the test itself.

There is one result bundle: it contains multiple sets of results, is
only created once all tests are complete, and is preserved between
reboots.

(This does need to be documented more clearly too.)

A reboot has to be one of the valid ways to end a test case - there is
too much state within typical test cases about which LAVA knows nothing
at all.

It is up to the test writer to put the reboots in the right places.
What we do need to do is document that this is done by having more than
one lava_test_shell block in the JSON instead of trying to do the
reboot in the YAML.
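
To make that concrete, a minimal sketch of a dispatcher JSON job with
two lava_test_shell blocks - the device type, image URL, test
definition URLs and result stream are placeholders, not real locations:

```json
{
  "job_name": "bootchart-with-reboot",
  "device_type": "panda",
  "actions": [
    {
      "command": "deploy_linaro_image",
      "parameters": { "image": "http://example.org/images/ubuntu.img.gz" }
    },
    {
      "command": "lava_test_shell",
      "parameters": { "testdef_urls": ["http://example.org/tests/install-bootchart.yaml"] }
    },
    {
      "command": "lava_test_shell",
      "parameters": { "testdef_urls": ["http://example.org/tests/run-bootchart.yaml"] }
    },
    {
      "command": "submit_results",
      "parameters": { "server": "http://validation.linaro.org/RPC2/", "stream": "/anonymous/test/" }
    }
  ]
}
```

The target reboots before each lava_test_shell action, so the reboot
lands between the install definition and the run definition.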

Extend this to a use case where the test support needs to be compiled
instead of installed and it is clear to me that the results need to
distinguish a failure to build the test support prior to the reboot
from a failure of the test operation after the reboot, i.e. each
operation is a different test with a different test result in the one
result bundle. In this use case, I can also see a third result which
is needed - the result of installing the build dependencies of the
thing to be compiled, the first of the three. The bundle would have
dep_install, compile & run results.
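
A sketch of what the middle of those three tests might look like as a
lava_test_shell YAML definition - the name, repository and build steps
are all hypothetical:

```yaml
metadata:
  format: Lava-Test Test Definition 1.0
  name: compile-test-support
  description: "Build the test support inside the image; dep_install and run are separate definitions"
run:
  steps:
    # hypothetical repository and build commands
    - "git clone git://example.org/test-support.git"
    - "cd test-support && lava-test-case compile --shell make"
```

The lava-test-case wrapper records a pass/fail result for the build
itself, which is what gives the compile step its own entry in the
bundle.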

> > bootchart is just a use-case. it needs to set up the OS, reboot to
> > collect data, finally test results are available.

That is just what LAVA does with two lava_test_shell blocks.
 
> Maybe my solution with two test definitions is the right way to do it
> after all. Ideally we could also have a separate action to just
> install packages in the test image, but I think the overhead of using
> a separate lava-test-shell test definition is low enough.

That can just as easily be a single YAML file.
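
A sketch of that install-only definition, assuming the standard
Lava-Test Test Definition 1.0 format (the dpkg check is just one way
to record a pass/fail result for the installation):

```yaml
metadata:
  format: Lava-Test Test Definition 1.0
  name: install-bootchart
  description: "Install the test dependency; the reboot happens before the next lava_test_shell action"
install:
  deps:
    - bootchart
run:
  steps:
    # confirm the package landed, so the bundle gets an explicit result
    - "lava-test-case install-bootchart --shell dpkg -s bootchart"
```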

-- 


Neil Williams
=============
http://www.linux.codehelp.co.uk/


_______________________________________________
linaro-validation mailing list
linaro-validation@lists.linaro.org
http://lists.linaro.org/mailman/listinfo/linaro-validation
