I went to the Ottawa Linux Symposium and heard several good talks on
testing and performance.  One of the outstanding ones (to me) was from the
University of Waterloo, about their testing apparatus and harness for
performance testing and the application of statistical analysis to the
results, since a single test run may not show the true picture.

They have a mechanism, and a lab, that allows people to build code for
testing, run the tests with supplied data, and then statistically analyse
the results to see how the code really performs.

It has a front end that allows you to supply the information necessary to
build your project, then distributes it to a number of different machines
to run a selected number of times.  Even though you may be running the
same data through each time, they can show that various other factors can
vary performance by as much as 15% on a given run of the test data.
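
To make the point concrete, here is a rough Python sketch of the idea
(this is my own illustration, not DataMill's code, and "./my_benchmark"
is just a stand-in for whatever you are measuring): time the same
workload many times and report the mean, the standard deviation, and a
confidence interval, rather than trusting a single number.

    import statistics
    import subprocess
    import time

    def time_one_run(cmd):
        # Wall-clock time for a single run of the benchmark command.
        start = time.perf_counter()
        subprocess.run(cmd, check=True)
        return time.perf_counter() - start

    def summarize(cmd, runs=30):
        # Repeat the run and summarize, instead of trusting one sample.
        samples = [time_one_run(cmd) for _ in range(runs)]
        mean = statistics.mean(samples)
        stdev = statistics.stdev(samples)
        # 95% confidence interval for the mean (normal approximation).
        half_width = 1.96 * stdev / (len(samples) ** 0.5)
        spread = (max(samples) - min(samples)) / mean
        print("mean %.3fs  stdev %.3fs  95%% CI [%.3f, %.3f]  spread %.1f%%"
              % (mean, stdev, mean - half_width, mean + half_width,
                 100 * spread))

    summarize(["./my_benchmark"], runs=30)

With run-to-run spreads in the 10-15% range, a single measurement can
easily suggest a speedup or slowdown that is not really there, which is
exactly what their statistical approach is guarding against.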

The project URL is here:

https://uwaterloo.ca/embedded-software-group/datamill

I asked them for a copy of their slides, which I have uploaded to Google
Drive; I will make the link available to anyone who wishes to see them.

My thoughts on this are that Linaro might want to study what they have done
and build this type of analysis into Lava, or perhaps the University of
Waterloo would like to incorporate parts of Lava into their back end, or
the two groups could work together.  I have sent them pointers to Lava web
pages from the Linaro site.

At a minimum, we might find that making machines attached to a Lava
instance available to the University's DataMill program (perhaps through a
gateway) would increase the visibility and testing coverage available for
Linaro's customer systems.

Warmest regards,

maddog