I think the most important thing is building at least once against the 4 Hadoop versions, and at least once against the 3 JDK versions. It's very unlikely that a particular JDK + Hadoop combination fails to compile while the same JDK with another Hadoop version, or the same Hadoop version with another JDK, builds fine.
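To make the idea concrete, here is a small Python sketch (not an actual CI config; the version lists and pairings are just illustrations of the ones discussed in this thread) checking that a sparse build matrix still covers every Hadoop and JDK version at least once, without the full cross-product:

```python
# Versions under discussion (illustrative, not an official Flink CI setup).
hadoop_versions = ["1.2.1", "2.0.0-alpha", "2.2.0", "2.5.0"]
jdk_versions = ["openjdk6", "oraclejdk7", "oraclejdk8"]

# Sparse matrix: pair old JDKs with old Hadoop releases, newer with newer.
matrix = [
    ("1.2.1", "openjdk6"),
    ("2.0.0-alpha", "openjdk6"),
    ("2.2.0", "oraclejdk7"),
    ("2.5.0", "oraclejdk8"),
]

# Every Hadoop version and every JDK appears at least once,
# with only 4 builds instead of the full 3 x 4 = 12.
assert {h for h, _ in matrix} == set(hadoop_versions)
assert {j for _, j in matrix} == set(jdk_versions)
```

This is the reasoning behind the pairing list below: full coverage of each individual version, not of every combination.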
I think you could get away with 4:

1.2.1 - 6
2.0.0-alpha - 6
2.2.0 - 7
2.5.0 - 8

This at least pairs old JDKs with old Hadoop versions. I am not sure Hadoop < 2.2 even reliably works with Java 7, for example, and testing Java 8 + Hadoop 1.2.1 is probably pointless. You can add back a few more pairs here and there if this feels too sparse.

On Wed, Aug 27, 2014 at 10:40 AM, Robert Metzger <[email protected]> wrote:

> Hi guys,
>
> while creating the 0.6-incubating release I noticed that build issues are
> often triggered by changing dependencies.
> In particular, we allow users to set the version of the Hadoop dependency.
>
> Right now, we test the following variants:
>
> (oraclejdk8, oraclejdk7, openjdk6) x (hadoop 1.2.1, hadoop 2.2.0)
>
> Accidentally, I found out that the recently merged streaming component
> does not build with hadoop 2.4.0 as a dependency (
> https://issues.apache.org/jira/browse/FLINK-1065).
>
> I'm suggesting to add the following versions to the pool of Hadoop
> versions we test against:
> 1) "hadoop 2.0.0-alpha"
> 2) "hadoop 2.5.0"
>
> 1) is going to be the replacement for the "cdh4" package, and I think we
> should test versions we are going to ship with releases. (
> https://issues.apache.org/jira/browse/FLINK-1068)
> 2) is the current stable Hadoop version. I think we should test against
> hadoop 2.2.0 and the latest stable Hadoop version.
>
> Adding these two versions would result in 3x4 = 12 builds per push / pull
> request, which is a lot, given that we can only run 5 tests in parallel.
> Therefore, I'm suggesting to add just 2 builds with "oraclejdk8" and the
> two new Hadoop versions.
>
> Opinions?
>
>
> -- Robert
