Hi, Yuming.

Is the project working correctly on JDK 8 for you?

When I simply cloned your repo and ran `mvn clean package` on
JDK 1.8.0_232, the unit tests did not pass.

I also tried rerunning after ignoring the two ORC table tests, as shown
below, but a unit test still fails.

~/A/test-spark-jdk11:master$ git diff | grep 'ORC table'
-  test("Datasource ORC table") {
+  ignore("Datasource ORC table") {
-  test("Hive ORC table") {
+  ignore("Hive ORC table") {

~/A/test-spark-jdk11:master$ mvn clean package
...
- Hive ORC table !!! IGNORED !!!
Run completed in 36 seconds, 999 milliseconds.
Total number of tests run: 2
Suites: completed 3, aborted 0
Tests: succeeded 1, failed 1, canceled 0, ignored 2, pending 0
*** 1 TEST FAILED ***

~/A/test-spark-jdk11:master$ java -version
openjdk version "1.8.0_232"
OpenJDK Runtime Environment (AdoptOpenJDK)(build 1.8.0_232-b09)
OpenJDK 64-Bit Server VM (AdoptOpenJDK)(build 25.232-b09, mixed mode)
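
If it helps to isolate the failure, a single suite can be run on its own
(a sketch, assuming your repo uses the scalatest-maven-plugin as Spark
itself does; `-Dtest=none` skips the surefire Java tests, and the suite
name is taken from the file path of your HiveTableSuite link below):

    mvn test -Dtest=none -DwildcardSuites=test.spark.HiveTableSuite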


Bests,
Dongjoon.

On Sun, Oct 27, 2019 at 1:38 PM Dongjoon Hyun <dongjoon.h...@gmail.com>
wrote:

> It doesn't seem to be a Hadoop issue, does it?
>
> What Yuming pointed out seems to be a `Hive 2.3.6` profile implementation
> issue, which is enabled only with the `Hadoop 3.2` profile.
>
> From my side, I'm +1 for publishing jars that depend on `Hadoop 3.2.0 /
> Hive 2.3.6` to Maven, starting with Apache Spark 3.0.0.
>
> For the others, I'd like to mention that this implies the following, too.
>
> 1. We are not going to use the Hive 1.2.1 library. Only the Hadoop 2.7
> profile tarball distribution will use Hive 1.2.1.
> 2. Although we depend on Hadoop 3.2.0, Hadoop 3.2.1 changed its Guava
> library version significantly.
>     So, this requires some attention in Apache Spark. Otherwise, we may hit
> some issues on the Hadoop 3.2.1+ runtime later.
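>
> (A quick way to check which Guava version actually lands on the classpath
> is plain Maven's dependency tree, nothing Spark-specific:
>
>     mvn dependency:tree -Dincludes=com.google.guava:guava
>
> This is worth re-running whenever the Hadoop dependency is bumped.)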
>
> Thanks,
> Dongjoon.
>
>
> On Sun, Oct 27, 2019 at 7:31 AM Sean Owen <sro...@gmail.com> wrote:
>
>> Is the Spark artifact actually any different between those builds? I
>> thought it just affected what else was included in the binary tarball.
>> If it matters, yes I'd publish a "Hadoop 3" version to Maven. (Scala
>> 2.12 is the only supported Scala version).
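>>
>> (One way to check would be to compare the resolved dependencies under the
>> two profiles, e.g. with the standard maven-dependency-plugin; the output
>> file names here are just an illustration:
>>
>>     mvn -Phadoop-2.7 dependency:list -DoutputFile=deps-hadoop-2.7.txt
>>     mvn -Phadoop-3.2 dependency:list -DoutputFile=deps-hadoop-3.2.txt
>>
>> and then diff the two files.)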
>>
>> On Sun, Oct 27, 2019 at 4:35 AM Yuming Wang <wgy...@gmail.com> wrote:
>> >
>> > Do we need to publish the Scala 2.12 + Hadoop 3.2 jar packages to the
>> > Maven repository? Otherwise, it will throw a NoSuchMethodError on Java 11.
>> > Here is an example:
>> >
>> https://github.com/wangyum/test-spark-jdk11/blob/master/src/test/scala/test/spark/HiveTableSuite.scala#L34-L38
>> >
>> https://github.com/wangyum/test-spark-jdk11/commit/927ce7d3766881fba98f2434055fa3a1d1544ad2/checks?check_suite_id=283076578
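>> >
>> > Once such jars are published, a downstream build could resolve them with
>> > something like the following (the coordinates and version are only an
>> > illustration of the not-yet-published artifact):
>> >
>> >     mvn dependency:get -Dartifact=org.apache.spark:spark-hive_2.12:3.0.0-preview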
>> >
>> >
>> > On Sat, Oct 26, 2019 at 10:41 AM Takeshi Yamamuro <
>> linguin....@gmail.com> wrote:
>> >>
>> >> Thanks for that work!
>> >>
>> >> > I don't think JDK 11 is a separate release (by design). We build
>> >> > everything targeting JDK 8 and it should work on JDK 11 too.
>> >> +1. A single package working on both JVMs looks nice.
>> >>
>> >>
>> >> On Sat, Oct 26, 2019 at 4:18 AM Sean Owen <sro...@gmail.com> wrote:
>> >>>
>> >>> I don't think JDK 11 is a separate release (by design). We build
>> >>> everything targeting JDK 8 and it should work on JDK 11 too.
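>> >>> (Concretely, that means compiling against the JDK 8 API even when
>> >>> building on a newer JDK, e.g. something like
>> >>>
>> >>>     javac --release 8 Foo.java
>> >>>
>> >>> where `--release` pins both the bytecode level and the visible JDK API;
>> >>> Foo.java is just a placeholder, and the actual compiler flags in the
>> >>> Spark build may differ.)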
>> >>>
>> >>> So, just two releases. But frankly, I think we soon need to stop making
>> >>> multiple releases for multiple Hadoop versions and stick to Hadoop 3.
>> >>> I think it's fine to try to release for Hadoop 2 while the support still
>> >>> exists, and because the difference happens to be larger due to the
>> >>> different Hive dependency.
>> >>>
>> >>> On Fri, Oct 25, 2019 at 2:08 PM Xingbo Jiang <jiangxb1...@gmail.com>
>> wrote:
>> >>> >
>> >>> > Hi all,
>> >>> >
>> >>> > I would like to open a discussion on how many packages shall be
>> >>> > released for 3.0.0-preview. The ones I can think of now:
>> >>> >
>> >>> > * Scala 2.12 + Hadoop 2.7
>> >>> > * Scala 2.12 + Hadoop 3.2
>> >>> > * Scala 2.12 + Hadoop 3.2 + JDK 11
>> >>> >
>> >>> > Do you have other combinations to add to the above list?
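>> >>> >
>> >>> > For reference, the above would roughly map to distribution builds like
>> >>> > the following (a sketch based on Spark's dev/make-distribution.sh; the
>> >>> > exact profiles and flags may differ):
>> >>> >
>> >>> >     ./dev/make-distribution.sh --tgz -Phadoop-2.7 -Phive -Phive-thriftserver
>> >>> >     ./dev/make-distribution.sh --tgz -Phadoop-3.2 -Phive -Phive-thriftserver
>> >>> >
>> >>> > with the JDK 11 combination layering a JDK 11 toolchain on top of the
>> >>> > Hadoop 3.2 build.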
>> >>> >
>> >>> > Cheers,
>> >>> >
>> >>> > Xingbo
>> >>>
>> >>
>> >>
>> >> --
>> >> ---
>> >> Takeshi Yamamuro
>>
