I'll quibble over the phrase "the same makefile logic".

I think it is OK to use the same Makefile infrastructure (e.g. the configure.sh mechanism) and the same top-level Makefile, but at some level there will need to be distinct Makefile logic specific to compiling the code for the tests.

I agree with the general concept of pre-building binaries, but it would be good to see the next level of detail:

-- where does the source code for the native parts of the tests live?
-- is it one library per test, or some other granularity?
-- what sort of hierarchy do the libraries end up in?

I am also concerned about the developer experience. One of the characteristics of the jtreg design has always been that it is dev-friendly, meaning it is easy and fast for developers to edit a test and rerun it. I don't myself work on native code, or on repos containing native code, so I'd be interested to hear how this will impact developers working on such tests, and those folks who simply want to run the tests.

-- Jon


On 04/25/2014 05:02 AM, Staffan Larsen wrote:
There are a couple of jtreg tests today that depend on native components 
(either JNI libraries or executables). These are handled in one of two ways:

1) The binaries are pre-compiled and checked into the repository (often inside 
jar files).
2) The test will try to invoke a compiler (gcc, cl, …) when the test is being 
run.

Neither of these is a very good solution. #1 makes it hard to set up the test 
for all platforms and requires binaries in the source control system. #2 
is hit-and-miss: the correct compiler may or may not be installed on the test 
machine, and the approach requires platform-specific logic to be maintained.

I would like to propose that these native components are instead compiled when 
the product is built, using the same makefile logic as the product. At product 
build time we know we have access to the (correct) compilers, and we have 
excellent support in the makefiles for building on all platforms.

If we build the native test components together with the product, we also have 
to take care of distributing the resulting binaries together with the product 
when we do testing across a larger number of machines. We will also need a way 
to tell the jtreg tests where these pre-built binaries are located.

I suggest that at the end of a distributed build run, the pre-built test 
binaries are packaged in a zip or tar file (just like the product bits) and 
stored next to the product bundles. When we run distributed tests, we need to 
pick up the product bundle and the test bundle before the testing is started.

To tell the tests where the native code is, I would like to add a flag to jtreg 
that points to the path of the binaries. This should cause jtreg to set 
java.library.path before invoking a test, and also to set a test.* property 
which can be used by the test to find its native components.
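
As a concrete (hypothetical) sketch of what a test could then do -- the 
property name test.nativepath is just a placeholder, since all I am proposing 
here is "a test.* property":

    // Sketch of how a test might locate its pre-built native bits under this
    // proposal. "test.nativepath" is a placeholder name, not a decided API.
    import java.nio.file.Path;
    import java.nio.file.Paths;

    public class UsePrebuiltNative {
        public static void main(String[] args) {
            // With java.library.path pointing at the pre-built binaries,
            // the JNI library resolves by its short name.
            System.loadLibrary("sample");

            // The test.* property would let a test find native executables too.
            String nativeDir = System.getProperty("test.nativepath");
            if (nativeDir != null) {
                Path helper = Paths.get(nativeDir, "samplehelper");
                System.out.println("native helper: " + helper);
            }
        }
    }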

This kind of setup would make it easier to add and maintain tests that have a 
native component. I think this will be especially important as more tests are 
written using jtreg in the hotspot repository.

Thoughts on this? Is the general approach ok? There are lots of details to be 
figured out, but at this stage I would like to hear feedback on the idea as 
such.

Thanks,
/Staffan

