Re: making a hadoop-common test run if a property is set

2012-12-18 Thread Colin McCabe
On Mon, Dec 17, 2012 at 11:03 AM, Steve Loughran ste...@hortonworks.com wrote:
 On 17 December 2012 16:06, Tom White t...@cloudera.com wrote:

 There are some tests like the S3 tests that end with "Test" (e.g.
 Jets3tNativeS3FileSystemContractTest) - unlike normal tests, which
 start with "Test". Only those that start with "Test" are run
 automatically (see the surefire configuration in
 hadoop-project/pom.xml). You have to run the others manually with mvn
 test -Dtest=

 The mechanism that Colin describes is probably better though, since
 the environment-specific tests can be run as a part of a full test run
 by Jenkins if configured appropriately.


 I'd like that -though one problem with the current system is that you need
 to get the s3 (and soon: openstack) credentials into
 src/test/resources/core-site.xml, which isn't the right approach. If we
 could get them into properties files things would be easier.
 That's overkill for adding a few more openstack tests -but I would like to
 make it easier to turn those and the rackspace ones on without sticking my
 secrets into an XML file under SCM

I think the way to go is to have one XML file include another.

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration xmlns:xi="http://www.w3.org/2001/XInclude">
  <property>
    <name>boring.config.1</name>
    <value>boring-value</value>
  </property>
  ... etc, etc...
  <xi:include href="../secret-stuff.xml" />
</configuration>

That way, you can keep the boring configuration under version control,
and still have your password sitting in a small separate
non-version-controlled XML file.
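
A quick way to sanity-check that the include is honoured (a sketch only --
it assumes Hadoop's Configuration loader is XInclude-aware, and the resource
path and the S3 key name below are placeholders rather than anything checked
in):

  import static org.junit.Assert.assertEquals;
  import static org.junit.Assert.assertNotNull;

  import org.apache.hadoop.conf.Configuration;
  import org.apache.hadoop.fs.Path;
  import org.junit.Test;

  public class TestSecretInclude {
    @Test
    public void includedSecretsAreVisible() {
      // Load only the checked-in file; its xi:include pulls in ../secret-stuff.xml.
      Configuration conf = new Configuration(false);
      conf.addResource(new Path("src/test/resources/core-site.xml"));
      assertEquals("boring-value", conf.get("boring.config.1"));
      // Defined only in the non-version-controlled secret file.
      assertNotNull(conf.get("fs.s3n.awsSecretAccessKey"));
    }
  }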

We use this trick a bunch with the HA configuration stuff-- 99% of the
configuration is the same between the Active and Standby Namenodes,
but you can't give them the same dfs.ha.namenode.id or dfs.name.dir.
Includes help a lot here.

 another tactic could be to have specific test projects: test-s3,
 test-openstack, test-... which contain nothing but test cases. You'd set
 jenkins up to run those test projects too -the reason for having the
 separate names is to make it blatantly clear which tests you've not run

I dunno.  Every time a project puts unit or system tests into a
separate project, the developers never run them.  I've seen it happen
enough times that I think I can call it an anti-pattern by now.  I
like having tests alongside the code-- to the maximum extent that is
possible.

cheers,
Colin




 Tom

 On Mon, Dec 17, 2012 at 10:06 AM, Steve Loughran ste...@hortonworks.com
 wrote:
  thanks, I'll have a look. I've always wanted to add the notion of skipped
  to test runs -all the way through to the XML and generated reports, but
  you'd have to do a new junit runner for this and tweak the reporting
 code.
  Which, if it involved going near maven source, is not something I am
  prepared to do
 
  On 14 December 2012 18:57, Colin McCabe cmcc...@alumni.cmu.edu wrote:
 
  One approach we've taken in the past is making the junit test skip
  itself when some precondition is not true.  Then, we often create a
  property which people can use to cause the skipped tests to become a
  hard error.
 
  For example, all the tests that rely on libhadoop start with these
 lines:
 
   @Test
   public void myTest() {
     Assume.assumeTrue(NativeCodeLoader.isNativeCodeLoaded());
     ...
   }
 
  This causes them to be silently skipped when libhadoop.so is not
  available or loaded (perhaps because it hasn't been built.)
 
  However, if you want to cause this to be a hard error, you simply run
   mvn test -Drequire.test.libhadoop
 
  See TestHdfsNativeCodeLoader.java to see how this is implemented.
 
  The main idea is that your Jenkins build slaves use all the -Drequire
  lines, but people running tests locally are not inconvenienced by the
  need to build libhadoop.so in every case.  This is especially good
  because libhadoop.so isn't known to build on certain platforms like
  AIX, etc.  It seems to be a good tradeoff so far.  I imagine that s3
  could do something similar.
 
  cheers,
  Colin
 
 
  On Fri, Dec 14, 2012 at 9:56 AM, Steve Loughran ste...@hortonworks.com
 
  wrote:
   The swiftfs tests need only to run if there's a target filesystem;
  copying
   the s3/s3n tests, something like
  
 <property>
   <name>test.fs.swift.name</name>
   <value>swift://your-object-store-herel/</value>
 </property>
  
   How does one actually go about making junit tests optional in
 mvn-land?
   Should the probe/skip logic be in the code -which can make people
 think
  the
   test passed when it didn't actually run? Or can I turn it on/off in
  maven?
  
   -steve
 



Re: making a hadoop-common test run if a property is set

2012-12-18 Thread Colin McCabe
On Tue, Dec 18, 2012 at 1:05 AM, Colin McCabe cmcc...@alumni.cmu.edu wrote:
 On Mon, Dec 17, 2012 at 11:03 AM, Steve Loughran ste...@hortonworks.com 
 wrote:
 On 17 December 2012 16:06, Tom White t...@cloudera.com wrote:

 There are some tests like the S3 tests that end with "Test" (e.g.
 Jets3tNativeS3FileSystemContractTest) - unlike normal tests, which
 start with "Test". Only those that start with "Test" are run
 automatically (see the surefire configuration in
 hadoop-project/pom.xml). You have to run the others manually with mvn
 test -Dtest=

 The mechanism that Colin describes is probably better though, since
 the environment-specific tests can be run as a part of a full test run
 by Jenkins if configured appropriately.


 I'd like that -though one problem with the current system is that you need
 to get the s3 (and soon: openstack) credentials into
 src/test/resources/core-site.xml, which isn't the right approach. If we
 could get them into properties files things would be easier.
 That's overkill for adding a few more openstack tests -but I would like to
  make it easier to turn those and the rackspace ones on without sticking my
 secrets into an XML file under SCM

 I think the way to go is to have one XML file include another.

 <?xml version="1.0"?>
 <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
 <configuration xmlns:xi="http://www.w3.org/2001/XInclude">
   <property>
     <name>boring.config.1</name>
     <value>boring-value</value>
   </property>
   ... etc, etc...
   <xi:include href="../secret-stuff.xml" />
 </configuration>

 That way, you can keep the boring configuration under version control,
 and still have your password sitting in a small separate
 non-version-controlled XML file.

 We use this trick a bunch with the HA configuration stuff-- 99% of the
 configuration is the same between the Active and Standby Namenodes,
 but you can't give them the same dfs.ha.namenode.id or dfs.name.dir.
 Includes help a lot here.

  another tactic could be to have specific test projects: test-s3,
  test-openstack, test-... which contain nothing but test cases. You'd set
  jenkins up to run those test projects too -the reason for having the
  separate names is to make it blatantly clear which tests you've not run

 I dunno.  Every time a project puts unit or system tests into a
 separate project, the developers never run them.  I've seen it happen
 enough times that I think I can call it an anti-pattern by now.  I
 like having tests alongside the code-- to the maximum extent that is
 possible.

Just to be clear, I'm not referring to any Hadoop-related project
here, just certain other open source (and not) ones I've worked on.
System/unit tests belong with the rest of the code, otherwise they get
stale real fast.

It sometimes makes sense for integration tests to live in a separate
repo, since by their nature they're usually talking to stuff that
lives in multiple repos.

best,
Colin


 cheers,
 Colin




 Tom

 On Mon, Dec 17, 2012 at 10:06 AM, Steve Loughran ste...@hortonworks.com
 wrote:
  thanks, I'll have a look. I've always wanted to add the notion of skipped
  to test runs -all the way through to the XML and generated reports, but
  you'd have to do a new junit runner for this and tweak the reporting
 code.
  Which, if it involved going near maven source, is not something I am
  prepared to do
 
  On 14 December 2012 18:57, Colin McCabe cmcc...@alumni.cmu.edu wrote:
 
  One approach we've taken in the past is making the junit test skip
  itself when some precondition is not true.  Then, we often create a
  property which people can use to cause the skipped tests to become a
  hard error.
 
  For example, all the tests that rely on libhadoop start with these
 lines:
 
   @Test
   public void myTest() {
     Assume.assumeTrue(NativeCodeLoader.isNativeCodeLoaded());
     ...
   }
 
  This causes them to be silently skipped when libhadoop.so is not
  available or loaded (perhaps because it hasn't been built.)
 
  However, if you want to cause this to be a hard error, you simply run
   mvn test -Drequire.test.libhadoop
 
  See TestHdfsNativeCodeLoader.java to see how this is implemented.
 
  The main idea is that your Jenkins build slaves use all the -Drequire
  lines, but people running tests locally are not inconvenienced by the
  need to build libhadoop.so in every case.  This is especially good
  because libhadoop.so isn't known to build on certain platforms like
  AIX, etc.  It seems to be a good tradeoff so far.  I imagine that s3
  could do something similar.
 
  cheers,
  Colin
 
 
  On Fri, Dec 14, 2012 at 9:56 AM, Steve Loughran ste...@hortonworks.com
 
  wrote:
   The swiftfs tests need only to run if there's a target filesystem;
  copying
   the s3/s3n tests, something like
  
 <property>
   <name>test.fs.swift.name</name>
   <value>swift://your-object-store-herel/</value>
 </property>
  
   How does one actually go about making junit tests optional in
 mvn-land?
   Should the probe/skip logic be in the code -which can make 

Re: making a hadoop-common test run if a property is set

2012-12-18 Thread Steve Loughran
On 18 December 2012 09:11, Colin McCabe cmcc...@alumni.cmu.edu wrote:

 On Tue, Dec 18, 2012 at 1:05 AM, Colin McCabe cmcc...@alumni.cmu.edu
 wrote:

 
  another tactic could be to have specific test projects: test-s3,
  test-openstack, test-... which contain nothing but test cases. You'd set
  jenkins up to run those test projects too -the reason for having the
  separate names is to make it blatantly clear which tests you've not run
 
  I dunno.  Every time a project puts unit or system tests into a
  separate project, the developers never run them.  I've seen it happen
  enough times that I think I can call it an anti-pattern by now.  I
  like having tests alongside the code-- to the maximum extent that is
  possible.

 Just to be clear, I'm not referring to any Hadoop-related project
 here, just certain other open source (and not) ones I've worked on.
 System/unit tests belong with the rest of the code, otherwise they get
 stale real fast.

 It sometimes makes sense for integration tests to live in a separate
 repo, since by their nature they're usually talking to stuff that
 lives in multiple repos.

 best,
 Colin

Oh, I understood that. Even with jenkins set up to build a chain of
projects, there's a risk (in my experience at a former employer) that the
people upstream wouldn't correlate mail from jenkins about project D's test
failures with the change they had just committed.

Even so, there's always a conflict between short-run unit tests and full
tests on a cluster of size 1: a short test cycle boosts desktop dev, but
you still want to be thorough.


Re: making a hadoop-common test run if a property is set

2012-12-18 Thread Steve Loughran
On 18 December 2012 09:05, Colin McCabe cmcc...@alumni.cmu.edu wrote:


 I think the way to go is to have one XML file include another.

 <?xml version="1.0"?>
 <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
 <configuration xmlns:xi="http://www.w3.org/2001/XInclude">
   <property>
     <name>boring.config.1</name>
     <value>boring-value</value>
   </property>
   ... etc, etc...
   <xi:include href="../secret-stuff.xml" />
 </configuration>

 That way, you can keep the boring configuration under version control,
 and still have your password sitting in a small separate
 non-version-controlled XML file.

 We use this trick a bunch with the HA configuration stuff-- 99% of the
 configuration is the same between the Active and Standby Namenodes,
 but you can't give them the same dfs.ha.namenode.id or dfs.name.dir.
 Includes help a lot here.

I like this approach -we could even have an xi:fallback within the include
element to say "include this other file if nothing is checked in".

The default, checked-in -site.xml could then go

<xi:include href="../custom-stuff.xml">
  <xi:fallback><xi:include href="../empty.xml" /></xi:fallback>
</xi:include>

I'll try this on my tests to see how well it holds up, because if it does
work it is something to consider checking in. (Yes, I know XInclude support
isn't there 100% of the time client side, but that won't be a problem on test
runs - https://issues.apache.org/jira/browse/HADOOP-5254 )


Re: making a hadoop-common test run if a property is set

2012-12-18 Thread Steve Loughran
On 17 December 2012 16:06, Tom White t...@cloudera.com wrote:

 There are some tests like the S3 tests that end with "Test" (e.g.
 Jets3tNativeS3FileSystemContractTest) - unlike normal tests, which
 start with "Test". Only those that start with "Test" are run
 automatically (see the surefire configuration in
 hadoop-project/pom.xml). You have to run the others manually with mvn
 test -Dtest=


Thinking some more (especially about how to make subclasses of
FileSystemContractTestBase optional without patching that base class), we
could add public static suite() methods to the child classes and have them
skip all the tests if they are optional.

I haven't abused the suite() method for a while; it used to be the only way
to do parameterized tests, but in theory this should work -though there's
the maintenance overhead of keeping the list of test methods to return up
to date.
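
A rough sketch of the suite() idea (illustrative only -- this isn't existing
Hadoop code; a real child class would extend the contract base and open the
filesystem named by test.fs.swift.name in setUp()):

  import junit.framework.Test;
  import junit.framework.TestCase;
  import junit.framework.TestSuite;
  import org.apache.hadoop.conf.Configuration;

  // Stand-in for a contract-test subclass; the same static suite() could be
  // dropped into the real child classes.
  public class SwiftContractSuiteTest extends TestCase {

    // JUnit 3 runs whatever suite() returns, so an empty suite means
    // "skip everything" when no target filesystem is configured.
    public static Test suite() {
      TestSuite suite = new TestSuite();
      if (new Configuration().get("test.fs.swift.name") != null) {
        suite.addTestSuite(SwiftContractSuiteTest.class);
      }
      return suite;
    }

    public void testTargetFilesystemConfigured() throws Exception {
      // ... the real contract tests would exercise the swift:// store here ...
    }
  }

Since addTestSuite() picks up the test methods by reflection, there's no
hand-maintained method list to keep up to date; the trade-off is that a
skipped run just reports zero tests rather than an explicit "skipped" count.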


Re: making a hadoop-common test run if a property is set

2012-12-17 Thread Steve Loughran
thanks, I'll have a look. I've always wanted to add the notion of skipped
to test runs -all the way through to the XML and generated reports, but
you'd have to do a new junit runner for this and tweak the reporting code.
Which, if it involved going near maven source, is not something I am
prepared to do

On 14 December 2012 18:57, Colin McCabe cmcc...@alumni.cmu.edu wrote:

 One approach we've taken in the past is making the junit test skip
 itself when some precondition is not true.  Then, we often create a
 property which people can use to cause the skipped tests to become a
 hard error.

 For example, all the tests that rely on libhadoop start with these lines:

  @Test
  public void myTest() {
    Assume.assumeTrue(NativeCodeLoader.isNativeCodeLoaded());
    ...
  }

 This causes them to be silently skipped when libhadoop.so is not
 available or loaded (perhaps because it hasn't been built.)

 However, if you want to cause this to be a hard error, you simply run
  mvn test -Drequire.test.libhadoop

 See TestHdfsNativeCodeLoader.java to see how this is implemented.

 The main idea is that your Jenkins build slaves use all the -Drequire
 lines, but people running tests locally are not inconvenienced by the
 need to build libhadoop.so in every case.  This is especially good
 because libhadoop.so isn't known to build on certain platforms like
 AIX, etc.  It seems to be a good tradeoff so far.  I imagine that s3
 could do something similar.

 cheers,
 Colin


 On Fri, Dec 14, 2012 at 9:56 AM, Steve Loughran ste...@hortonworks.com
 wrote:
  The swiftfs tests need only to run if there's a target filesystem;
 copying
  the s3/s3n tests, something like
 
<property>
  <name>test.fs.swift.name</name>
  <value>swift://your-object-store-herel/</value>
</property>
 
  How does one actually go about making junit tests optional in mvn-land?
  Should the probe/skip logic be in the code -which can make people think
 the
  test passed when it didn't actually run? Or can I turn it on/off in
 maven?
 
  -steve



Re: making a hadoop-common test run if a property is set

2012-12-17 Thread Tom White
There are some tests like the S3 tests that end with "Test" (e.g.
Jets3tNativeS3FileSystemContractTest) - unlike normal tests, which
start with "Test". Only those that start with "Test" are run
automatically (see the surefire configuration in
hadoop-project/pom.xml). You have to run the others manually with mvn
test -Dtest=

The mechanism that Colin describes is probably better though, since
the environment-specific tests can be run as a part of a full test run
by Jenkins if configured appropriately.

Tom

On Mon, Dec 17, 2012 at 10:06 AM, Steve Loughran ste...@hortonworks.com wrote:
 thanks, I'll have a look. I've always wanted to add the notion of skipped
 to test runs -all the way through to the XML and generated reports, but
 you'd have to do a new junit runner for this and tweak the reporting code.
 Which, if it involved going near maven source, is not something I am
 prepared to do

 On 14 December 2012 18:57, Colin McCabe cmcc...@alumni.cmu.edu wrote:

 One approach we've taken in the past is making the junit test skip
 itself when some precondition is not true.  Then, we often create a
 property which people can use to cause the skipped tests to become a
 hard error.

 For example, all the tests that rely on libhadoop start with these lines:

  @Test
  public void myTest() {
    Assume.assumeTrue(NativeCodeLoader.isNativeCodeLoaded());
    ...
  }

 This causes them to be silently skipped when libhadoop.so is not
 available or loaded (perhaps because it hasn't been built.)

 However, if you want to cause this to be a hard error, you simply run
  mvn test -Drequire.test.libhadoop

 See TestHdfsNativeCodeLoader.java to see how this is implemented.

 The main idea is that your Jenkins build slaves use all the -Drequire
 lines, but people running tests locally are not inconvenienced by the
 need to build libhadoop.so in every case.  This is especially good
 because libhadoop.so isn't known to build on certain platforms like
 AIX, etc.  It seems to be a good tradeoff so far.  I imagine that s3
 could do something similar.

 cheers,
 Colin


 On Fri, Dec 14, 2012 at 9:56 AM, Steve Loughran ste...@hortonworks.com
 wrote:
  The swiftfs tests need only to run if there's a target filesystem;
 copying
  the s3/s3n tests, something like
 
<property>
  <name>test.fs.swift.name</name>
  <value>swift://your-object-store-herel/</value>
</property>
 
  How does one actually go about making junit tests optional in mvn-land?
  Should the probe/skip logic be in the code -which can make people think
 the
  test passed when it didn't actually run? Or can I turn it on/off in
 maven?
 
  -steve



Re: making a hadoop-common test run if a property is set

2012-12-17 Thread Steve Loughran
On 17 December 2012 16:06, Tom White t...@cloudera.com wrote:

 There are some tests like the S3 tests that end with "Test" (e.g.
 Jets3tNativeS3FileSystemContractTest) - unlike normal tests, which
 start with "Test". Only those that start with "Test" are run
 automatically (see the surefire configuration in
 hadoop-project/pom.xml). You have to run the others manually with mvn
 test -Dtest=

 The mechanism that Colin describes is probably better though, since
 the environment-specific tests can be run as a part of a full test run
 by Jenkins if configured appropriately.


I'd like that -though one problem with the current system is that you need
to get the s3 (and soon: openstack) credentials into
src/test/resources/core-site.xml, which isn't the right approach. If we
could get them into properties files things would be easier.

another tactic could be to have specific test projects: test-s3,
test-openstack, test-... which contain nothing but test cases. You'd set
jenkins up to run those test projects too -the reason for having the
separate names is to make it blatantly clear which tests you've not run

That's overkill for adding a few more openstack tests -but I would like to
make it easier to turn those and the rackspace ones on without sticking my
secrets into an XML file under SCM



 Tom

 On Mon, Dec 17, 2012 at 10:06 AM, Steve Loughran ste...@hortonworks.com
 wrote:
  thanks, I'll have a look. I've always wanted to add the notion of skipped
  to test runs -all the way through to the XML and generated reports, but
  you'd have to do a new junit runner for this and tweak the reporting
 code.
  Which, if it involved going near maven source, is not something I am
  prepared to do
 
  On 14 December 2012 18:57, Colin McCabe cmcc...@alumni.cmu.edu wrote:
 
  One approach we've taken in the past is making the junit test skip
  itself when some precondition is not true.  Then, we often create a
  property which people can use to cause the skipped tests to become a
  hard error.
 
  For example, all the tests that rely on libhadoop start with these
 lines:
 
   @Test
   public void myTest() {
     Assume.assumeTrue(NativeCodeLoader.isNativeCodeLoaded());
     ...
   }
 
  This causes them to be silently skipped when libhadoop.so is not
  available or loaded (perhaps because it hasn't been built.)
 
  However, if you want to cause this to be a hard error, you simply run
   mvn test -Drequire.test.libhadoop
 
  See TestHdfsNativeCodeLoader.java to see how this is implemented.
 
  The main idea is that your Jenkins build slaves use all the -Drequire
  lines, but people running tests locally are not inconvenienced by the
  need to build libhadoop.so in every case.  This is especially good
  because libhadoop.so isn't known to build on certain platforms like
  AIX, etc.  It seems to be a good tradeoff so far.  I imagine that s3
  could do something similar.
 
  cheers,
  Colin
 
 
  On Fri, Dec 14, 2012 at 9:56 AM, Steve Loughran ste...@hortonworks.com
 
  wrote:
   The swiftfs tests need only to run if there's a target filesystem;
  copying
   the s3/s3n tests, something like
  
 <property>
   <name>test.fs.swift.name</name>
   <value>swift://your-object-store-herel/</value>
 </property>
  
   How does one actually go about making junit tests optional in
 mvn-land?
   Should the probe/skip logic be in the code -which can make people
 think
  the
   test passed when it didn't actually run? Or can I turn it on/off in
  maven?
  
   -steve
 



making a hadoop-common test run if a property is set

2012-12-14 Thread Steve Loughran
The swiftfs tests need only to run if there's a target filesystem; copying
the s3/s3n tests, something like

  <property>
    <name>test.fs.swift.name</name>
    <value>swift://your-object-store-herel/</value>
  </property>

How does one actually go about making junit tests optional in mvn-land?
Should the probe/skip logic be in the code -which can make people think the
test passed when it didn't actually run? Or can I turn it on/off in maven?

-steve


Re: making a hadoop-common test run if a property is set

2012-12-14 Thread Colin McCabe
One approach we've taken in the past is making the junit test skip
itself when some precondition is not true.  Then, we often create a
property which people can use to cause the skipped tests to become a
hard error.

For example, all the tests that rely on libhadoop start with these lines:

  @Test
  public void myTest() {
    Assume.assumeTrue(NativeCodeLoader.isNativeCodeLoaded());
    ...
  }

This causes them to be silently skipped when libhadoop.so is not
available or loaded (perhaps because it hasn't been built.)

However, if you want to cause this to be a hard error, you simply run
 mvn test -Drequire.test.libhadoop

See TestHdfsNativeCodeLoader.java to see how this is implemented.

The main idea is that your Jenkins build slaves use all the -Drequire
lines, but people running tests locally are not inconvenienced by the
need to build libhadoop.so in every case.  This is especially good
because libhadoop.so isn't known to build on certain platforms like
AIX, etc.  It seems to be a good tradeoff so far.  I imagine that s3
could do something similar.
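
For the s3/swift case, a minimal sketch of the same pattern might look like
the following (test.fs.swift.name is the property from the quoted mail below;
require.test.swift is made up here by analogy with require.test.libhadoop):

  import static org.junit.Assert.assertTrue;
  import static org.junit.Assume.assumeTrue;

  import org.apache.hadoop.conf.Configuration;
  import org.junit.Before;
  import org.junit.Test;

  public class TestSwiftIfConfigured {
    private Configuration conf;

    @Before
    public void checkTargetFilesystemConfigured() {
      conf = new Configuration();
      boolean configured = conf.get("test.fs.swift.name") != null;
      if (System.getProperty("require.test.swift") != null) {
        // Jenkins slaves run with -Drequire.test.swift, so a missing target
        // filesystem becomes a hard failure instead of a silent skip.
        assertTrue("test.fs.swift.name is not set", configured);
      }
      // Locally, an unset target filesystem just skips the tests.
      assumeTrue(configured);
    }

    @Test
    public void testSomethingAgainstTheTargetFilesystem() throws Exception {
      // ... exercise the filesystem named by test.fs.swift.name ...
    }
  }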

cheers,
Colin


On Fri, Dec 14, 2012 at 9:56 AM, Steve Loughran ste...@hortonworks.com wrote:
 The swiftfs tests need only to run if there's a target filesystem; copying
 the s3/s3n tests, something like

   <property>
     <name>test.fs.swift.name</name>
     <value>swift://your-object-store-herel/</value>
   </property>

 How does one actually go about making junit tests optional in mvn-land?
 Should the probe/skip logic be in the code -which can make people think the
 test passed when it didn't actually run? Or can I turn it on/off in maven?

 -steve