It doesn't seem to be a Hadoop issue, does it?

What Yuming pointed out seems to be a `Hive 2.3.6` profile
implementation issue, and that profile is enabled only with `Hadoop 3.2`.
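
For reference, the failure mode can be reproduced with a minimal
Hive-backed query on Java 11. This is only a sketch, modeled loosely on
the HiveTableSuite test Yuming linked below; the table name and session
settings here are illustrative:

    // Sketch: a Hive-backed query on Java 11. With jars built under the
    // Hive 1.2.1 profile this is where the NoSuchMethodError surfaces;
    // the Hadoop 3.2 / Hive 2.3.6 jars are expected to work.
    // Requires spark-hive on the classpath.
    import org.apache.spark.sql.SparkSession

    object HiveOnJava11Check {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .master("local[2]")
          .appName("hive-on-java11-check")
          .enableHiveSupport()
          .getOrCreate()
        spark.sql("CREATE TABLE IF NOT EXISTS t(id INT) USING hive")
        spark.sql("INSERT INTO t VALUES (1)")
        spark.sql("SELECT * FROM t").show()
        spark.stop()
      }
    }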

From my side, I'm +1 for publishing jars which depend on the `Hadoop
3.2.0 / Hive 2.3.6` jars to Maven, starting with Apache Spark 3.0.0.
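
For context, a rough sketch of how such artifacts would be built,
assuming the `hadoop-3.2`, `hive`, and `hive-thriftserver` profile
names in the current build (adjust if the profiles change before the
release):

    # Build and install jars compiled against Hadoop 3.2.0 / Hive 2.3.6.
    ./build/mvn -Phadoop-3.2 -Phive -Phive-thriftserver -DskipTests clean install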

For the others, I'd like to mention that this implies the following, too.

1. We are not going to use the Hive 1.2.1 library. Only the Hadoop 2.7
profile tarball distribution will use Hive 1.2.1.
2. Although we depend on Hadoop 3.2.0, Hadoop 3.2.1 changed its Guava
library version significantly.
    So, this requires some attention in Apache Spark. Otherwise, we may
hit issues on a Hadoop 3.2.1+ runtime later (see the sketch after this
list).
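
As a concrete illustration of item 2, a downstream pom could force a
single Guava version when mixing these jars with a newer Hadoop runtime.
This is only a hypothetical sketch, and the version number is an
example, not a recommendation:

    <!-- Hypothetical downstream pom fragment: pin one Guava version
         across Spark and a Hadoop 3.2.1+ runtime (the version is only
         an example). -->
    <dependencyManagement>
      <dependencies>
        <dependency>
          <groupId>com.google.guava</groupId>
          <artifactId>guava</artifactId>
          <version>27.0-jre</version>
        </dependency>
      </dependencies>
    </dependencyManagement>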

Thanks,
Dongjoon.


On Sun, Oct 27, 2019 at 7:31 AM Sean Owen <sro...@gmail.com> wrote:

> Is the Spark artifact actually any different between those builds? I
> thought it just affected what else was included in the binary tarball.
> If it matters, yes I'd publish a "Hadoop 3" version to Maven. (Scala
> 2.12 is the only supported Scala version).
>
> On Sun, Oct 27, 2019 at 4:35 AM Yuming Wang <wgy...@gmail.com> wrote:
> >
> > Do we need to publish the Scala 2.12 + Hadoop 3.2 jar packages to the
> > Maven repository? Otherwise, it will throw a NoSuchMethodError on Java 11.
> > Here is an example:
> >
> > https://github.com/wangyum/test-spark-jdk11/blob/master/src/test/scala/test/spark/HiveTableSuite.scala#L34-L38
> >
> > https://github.com/wangyum/test-spark-jdk11/commit/927ce7d3766881fba98f2434055fa3a1d1544ad2/checks?check_suite_id=283076578
> >
> >
> > On Sat, Oct 26, 2019 at 10:41 AM Takeshi Yamamuro <linguin....@gmail.com> wrote:
> >>
> >> Thanks for that work!
> >>
> >> > I don't think JDK 11 is a separate release (by design). We build
> >> > everything targeting JDK 8 and it should work on JDK 11 too.
> >> +1. A single package working on both JVMs looks nice.
> >>
> >>
> >> On Sat, Oct 26, 2019 at 4:18 AM Sean Owen <sro...@gmail.com> wrote:
> >>>
> >>> I don't think JDK 11 is a separate release (by design). We build
> >>> everything targeting JDK 8 and it should work on JDK 11 too.
> >>>
> >>> So, just two releases. But frankly, I think we soon need to stop
> >>> making multiple releases for multiple Hadoop versions and stick to
> >>> Hadoop 3. I think it's fine to try to release for Hadoop 2 while the
> >>> support still exists, and because the difference happens to be larger
> >>> due to the different Hive dependency.
> >>>
> >>> On Fri, Oct 25, 2019 at 2:08 PM Xingbo Jiang <jiangxb1...@gmail.com> wrote:
> >>> >
> >>> > Hi all,
> >>> >
> >>> > I would like to bring up a discussion on how many packages shall be
> >>> > released in 3.0.0-preview. The ones I can think of now:
> >>> >
> >>> > * scala 2.12 + hadoop 2.7
> >>> > * scala 2.12 + hadoop 3.2
> >>> > * scala 2.12 + hadoop 3.2 + JDK 11
> >>> >
> >>> > Do you have other combinations to add to the above list?
> >>> >
> >>> > Cheers,
> >>> >
> >>> > Xingbo
> >>>
> >>
> >>
> >> --
> >> ---
> >> Takeshi Yamamuro
>
