Happy Diwali everyone!!!
Xiao
Hi, Yuming.
Is the project working correctly on JDK 8 for you?
When I simply cloned your repo and ran `mvn clean package` on
JDK 1.8.0_232, the UTs did not pass.
I also tried to rerun after ignoring two ORC table tests like the
following, but the UTs still fail.
It doesn't seem to be a Hadoop issue, does it?
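(For illustration, ignoring a single test case in a ScalaTest FunSuite-style suite looks roughly like the sketch below; the suite and test names are hypothetical, not the actual ORC table tests referred to above.)

```scala
// Hypothetical names for illustration only: replacing `test` with `ignore`
// makes ScalaTest report the case as ignored instead of running it.
import org.scalatest.FunSuite

class OrcTableSuite extends FunSuite {

  // was: test("create and read an ORC table") { ... }
  ignore("create and read an ORC table") {
    // body unchanged; this test will no longer run
  }
}
```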
What Yuming pointed out seems to be a `Hive 2.3.6` profile implementation
issue, which is enabled only with the `Hadoop 3.2` profile.
From my side, I'm +1 for publishing jars which depend on `Hadoop 3.2.0 /
Hive 2.3.6` jars to Maven, starting with Apache Spark 3.0.0.
Is the Spark artifact actually any different between those builds? I
thought it just affected what else was included in the binary tarball.
If it matters, yes I'd publish a "Hadoop 3" version to Maven. (Scala
2.12 is the only supported Scala version).
On Sun, Oct 27, 2019 at 4:35 AM Yuming Wang wrote:
Do we need to publish the Scala 2.12 + Hadoop 3.2 jar packages to the Maven
repository? Otherwise Spark will throw a NoSuchMethodError on Java 11.
Here is an example:
https://github.com/wangyum/test-spark-jdk11/blob/master/src/test/scala/test/spark/HiveTableSuite.scala#L34-L38
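(A minimal sketch of the failure mode being described, assuming the default pre-2.3 built-in Hive client is on the classpath; the object and table names below are made up and this is not the linked HiveTableSuite test itself.)

```scala
// Hypothetical sketch: the Hive client is exercised as soon as a Hive-format
// table is created or read, which is where a NoSuchMethodError surfaces on
// Java 11 when Spark was built against the older built-in Hive dependency.
import org.apache.spark.sql.SparkSession

object HiveTableJdk11Check {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("hive-table-jdk11-check")
      .enableHiveSupport() // needs spark-hive and its Hive client jars on the classpath
      .getOrCreate()

    // Creating and reading a Hive table goes through the Hive metastore client.
    spark.sql("CREATE TABLE IF NOT EXISTS t (id INT) USING hive")
    spark.sql("INSERT INTO t VALUES (1)")
    spark.table("t").show()

    spark.stop()
  }
}
```

On Java 11 the error shows up as soon as the Hive client is touched, which is presumably why jars built against Hadoop 3.2 / Hive 2.3.6 are needed in the Maven repository for that setup.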