Andy Doan <[email protected]> writes:
> On 11/15/2012 08:37 PM, Michael Hudson-Doyle wrote:
>> I was recently making some additions to the lava-test-shell code and in
>> the process more or less had to write some documentation on how things
>> work. Here is what I wrote (actually what I wrote includes stuff about
>> test result attachments but that's not in trunk yet):
>>
>
> [snip]
> I'm embarrassed for not doing that in my original commit. Thanks.
>
>> But the tweaks I want to propose are more to do with what goes into the
>> /lava/bin and /lava/results directories. For a start, I don't think
>> it's especially useful that the tests run with lava-test-runner or
>> lava-test-shell on $PATH -- there's no reason for a test author to call
>
> It's mostly harmless. I didn't intend developers to use those two
> scripts. I just thought it might make it easier to not have to spell
> out the full path to those in our own code.
Yeah true. I should probably just not worry about it!
>> either of those scripts! However, I want to add helpers -- called
>> lava-test-case and lava-test-case-attach in my brain currently:
>>
>> * lava-test-case will send the test case started signal, run a shell
>>   command, interpret its exit code as a test result and send the test
>>   case stopped signal (there's a rough sketch of this below)
>> * lava-test-case-attach arranges to attach a file to the test result
>>   for a particular test case id
>>
>> So you could imagine some run steps for an audio capture test:
>>
>> run:
>>     steps:
>>         - lava-fft -t 3 > generated.wav
>>         - lava-test-case-attach sine-wave generated.wav audio/wav
>>         - lava-test-case sine-wave aplay generated.wav
>>
>> and appropriate hooks on the host side:
>>
>> * test case start/stop hooks that would capture audio
>> * an "analysis hook" that would compare the captured sample with the
>> attached wav file (and attach the captured wav)
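>>
>> To make that concrete, here is a rough sketch of what lava-test-case
>> might look like (completely untested, and the <LAVA_SIGNAL_*> marker
>> format below is just a placeholder for whatever we end up settling
>> on for signals):
>>
>>     #!/bin/sh
>>     # Usage: lava-test-case TEST_CASE_ID COMMAND [ARG...]
>>     # Send the "test case started" signal, run the command, turn its
>>     # exit code into a result, send the "test case stopped" signal.
>>     TEST_CASE_ID="$1"
>>     shift
>>     echo "<LAVA_SIGNAL_STARTTC $TEST_CASE_ID>"
>>     "$@"
>>     rc=$?
>>     echo "<LAVA_SIGNAL_ENDTC $TEST_CASE_ID>"
>>     if [ "$rc" -eq 0 ]; then
>>         result=pass
>>     else
>>         result=fail
>>     fi
>>     echo "test case $TEST_CASE_ID: $result"
>>     exit "$rc"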
>
> +1 - and I liked your branch that does the attachment logic.
Cool.
>> Semi-relatedly, I would like to (at least optionally) store the test
>> result data more explicitly in the
>> /lava/results/${IDX}_${TEST_ID}-${TIMESTAMP} directory. Maybe something
>> like this (in the above style):
>>
>> # /lava/
>> #    results/
>> #       ... As before
>> #       ${IDX}_${TEST_ID}-${TIMESTAMP}/
>> #          ... All the stuff we had before.
>> #          cases/
>> #             ${TEST_CASE_ID}/
>> #                result                 This would contain pass/fail/skip/unknown
>> #                units                  Mb/s, V, W, Hz, whatever
>> #                measurement            Self-explanatory, I expect.
>> #                attachments/
>> #                   ${FILENAME}           The content of the attachment
>> #                   ${FILENAME}.mimetype  The mime-type for the attachment
>> #                attributes/
>> #                   ${KEY}                The content of the file would
>> #                                         be the value of the attribute.
>> #                ... other things you can stick on test results ...
>>
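>> Under that layout, recording a result from shell is just writing
>> files. Something like this (hypothetical -- the helpers would do the
>> path bookkeeping for you, and lava-test-case-attach would be little
>> more than the last two lines):
>>
>>     # RESULT_DIR stands for /lava/results/${IDX}_${TEST_ID}-${TIMESTAMP};
>>     # presumably the harness would export it before running the steps.
>>     CASE_DIR="$RESULT_DIR/cases/sine-wave"
>>     mkdir -p "$CASE_DIR/attachments" "$CASE_DIR/attributes"
>>     echo pass > "$CASE_DIR/result"
>>     echo 440 > "$CASE_DIR/measurement"
>>     echo Hz > "$CASE_DIR/units"
>>     cp generated.wav "$CASE_DIR/attachments/generated.wav"
>>     echo audio/wav > "$CASE_DIR/attachments/generated.wav.mimetype"
>>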
>> Basically this would be defining another representation for test
>> results -- on the file system -- in addition to the existing two:
>> JSON and a postgres DB.
>>
>> The reason for doing this is twofold: 1) it's more amenable than JSON
>> to being incrementally built up by a bunch of shell scripts as a
>> lava-test-shell test runs and 2) this directory could be presented to
>> an "analysis hook" written in shell (earlier today I told Andy Doan
>> that I thought writing hooks in shell would be impractical; now I'm
>> not so sure). Also: 3) (no one expects the Spanish Inquisition!) it
>> would allow us to write a lava-test-shell test that does not depend
>> on parsing stdout.log.
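>>
>> For 2), an analysis hook in shell really could be as dumb as walking
>> the tree -- something like this (the hook interface, i.e. what the
>> hook actually gets called with, is of course not designed yet):
>>
>>     #!/bin/sh
>>     # Hypothetical analysis hook, called with the result directory
>>     # (the ${IDX}_${TEST_ID}-${TIMESTAMP} one) as $1.
>>     for case_dir in "$1"/cases/*/; do
>>         id=$(basename "$case_dir")
>>         printf '%s: %s\n' "$id" "$(cat "$case_dir/result")"
>>         # e.g. compare the captured sample against
>>         # $case_dir/attachments/generated.wav here and rewrite the
>>         # result file accordingly.
>>     done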
>
> This sounds good, but I worry about how it plays out. Could you
> elaborate a little on how you think a person would write such a test?
OK, how about
http://people.linaro.org/~mwh/lava-test-case-doc/lava_test_shell.html#writing-a-test-for-lava-test-shell
? This is built from a branch I've been working on today (the name is
probably not quite right any more):
https://code.launchpad.net/~mwhudson/lava-dispatcher/more-obvious-json-disk-bundle-equivalence
The idea (likely obvious to you, but probably not to anyone else) is
that lava-test-case --shell will send start/stop test case signals to
the host. I'd hoped to get a version of that done today, but it's
getting a bit late now...
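So the run step from earlier would become something like this (exact
spelling of the flag still to be decided):

    - lava-test-case sine-wave --shell aplay generated.wav

with the start/stop signals going out just before and after aplay
runs.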
> i.e. it feels like we are on the path to becoming not only a test
> harness but also a test framework. I guess with the implication of
> signals and such, we have to become more of a framework, so my worry
> might be unavoidable.
Yes, I think so. We can't rely on parsing output after the fact to
signal to the host when a test case starts and stops...
Cheers,
mwh
_______________________________________________
linaro-validation mailing list
[email protected]
http://lists.linaro.org/mailman/listinfo/linaro-validation