On Wednesday 12 Dec 2012 16:08:59 Paul Sokolovsky wrote:
> On Wed, 12 Dec 2012 12:18:56 +0000
> Matthew Gretton-Dann <[email protected]> wrote:
> > On Wednesday 12 Dec 2012 13:17:05 Paul Sokolovsky wrote:
> One thing we don't fully understand is the "build" concept as used in
> CBuild. For example, in Jenkins it's simple - there's a "job", which is
> a kind of class, and a "build", which is a kind of instance of a job,
> built against a particular revision of the sources and capturing build
> logs (so, for each job, it's possible to browse all recent builds and
> their results). But with CBuild, it seems to me that it tracks only the
> latest build of a job. What happens with previous builds - are they
> just removed, or are their build artifacts still available in some
> distinct directory, just not referenced in the web frontend?

For CBuild, a Job in the web interface corresponds directly to a .job file in 
a queue (say FOO.job).  So a FOO.job file may exist in many queues.
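As a rough sketch of that job-to-queue relationship (the directory layout, queue names, and paths here are my assumptions for illustration, not CBuild's actual on-disk structure):

```python
import os
import tempfile

# Hypothetical layout: each queue is a directory that may contain a
# FOO.job file.  Names and paths are illustrative only.
root = tempfile.mkdtemp()
for queue in ("urgent", "default", "slow"):
    os.makedirs(os.path.join(root, queue))

# The same job file can sit in several queues at once.
for queue in ("urgent", "slow"):
    open(os.path.join(root, queue, "FOO.job"), "w").close()

# Find every queue that currently holds FOO.job.
queues_with_foo = sorted(
    q for q in os.listdir(root)
    if os.path.exists(os.path.join(root, q, "FOO.job"))
)
print(queues_with_foo)  # ['slow', 'urgent']
```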

From what you describe, I think FOO.job corresponds most directly to a Jenkins 
build.

The Jenkins 'job' concept is built in to CBuild: CBuild will take a FOO.job 
and determine what to do based upon the name of the job (FOO) and some flags 
in FOO.job.

Respins overwrite the previous job's output by default - you have to add some 
magic to the FOO.job file to make the two outputs live side by side.
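A minimal sketch of that overwrite-versus-side-by-side behaviour (the flag name `keep_respins` and the path scheme are invented for illustration; they are not CBuild's actual mechanism):

```python
def output_dir(job_name, respin, flags):
    """Choose the output directory for one respin of a job.

    By default every respin writes to the same directory, so a new
    respin overwrites the previous output.  With a flag set in the
    .job file (flag name and layout invented here), each respin gets
    its own directory and the outputs live side by side.
    """
    if flags.get("keep_respins"):
        return "%s/build-%d" % (job_name, respin)
    return "%s/latest" % job_name

# Default behaviour: respin 2 lands where respin 1 did, clobbering it.
assert output_dir("FOO", 1, {}) == output_dir("FOO", 2, {})
# With the "magic" flag, the two outputs are kept side by side.
assert output_dir("FOO", 1, {"keep_respins": True}) != \
       output_dir("FOO", 2, {"keep_respins": True})
```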

> > What is the intention here?  Are we going to have LAVA be the
> > high-level builder, and so its 'tests' are each of the build steps,
> > and then the testsuite analysis is done elsewhere? This is the current
> > setup (see http://cbuild.validation.linaro.org/helpers/testcompare/gcc-linaro-4.6-2012.09/logs/x86_64-precise-cbuild367-oort3-x86_64r1/gcc-testsuite.txt
> > - except this is currently returning an Internal Server Error). Or is
> > the intention for LAVA to handle all of the gcc-testsuite results
> > individually?
> 
> My intention was to explore what result representations LAVA allows,
> and to get actual experience with that. And it's pretty flexible, so
> both of the options you mentioned are possible (recording each case
> individually will of course require more work - either adding commands
> to record each testcase's results to the existing makefiles/scripts, or
> writing a parser for the test output, or both).
> 
> Which exact option to use is up to the stakeholders, and by that I
> mean not only TCWG, but also Validation, QA, Release Managers, and
> Platform in general, who may be interested in detailed and uniformly
> represented test results for the toolchain. I see Infrastructure's
> role as ensuring the various options are possible, and providing
> focused documentation/examples of how to achieve that. So, that's
> definitely something to discuss, but likely in 13.01, once basic
> integration is complete.

At the moment I do not have a preference - as long as it is clear what the 
current state is (which it is now).

At some point there will be work needed on results presentation layers - but 
whether that happens in LAVA or not is not important today.

Thanks,

Matt

-- 
Matthew Gretton-Dann
Toolchain Working Group, Linaro

_______________________________________________
linaro-validation mailing list
[email protected]
http://lists.linaro.org/mailman/listinfo/linaro-validation
