I don't think JDK 11 needs a separate release (by design). We build
everything targeting JDK 8, and the same artifacts should work on JDK 11 too.

So, just two releases. Frankly, though, I think we will soon need to stop
producing multiple releases for multiple Hadoop versions and stick to
Hadoop 3. For now it's fine to keep releasing for Hadoop 2, since it is
still supported and since the difference happens to be larger than usual
due to the different Hive dependency.
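
Concretely, the two convenience binaries could be produced along these
lines with the standard distribution script. This is only a sketch: the
profile names (-Phadoop-2.7, -Phadoop-3.2, -Phive, -Phive-thriftserver,
-Pyarn) are my assumption from the build at the time, so double-check
them against the actual release scripts.

```shell
# Sketch only; profile names assumed from the Spark 3.0-era build.

# Scala 2.12 + Hadoop 2.7 (older Hive dependency line)
./dev/make-distribution.sh --name hadoop2.7 --tgz \
  -Phadoop-2.7 -Phive -Phive-thriftserver -Pyarn

# Scala 2.12 + Hadoop 3.2 (newer Hive dependency line); the same
# artifacts, built targeting JDK 8, are expected to run on JDK 11 too
./dev/make-distribution.sh --name hadoop3.2 --tgz \
  -Phadoop-3.2 -Phive -Phive-thriftserver -Pyarn
```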

On Fri, Oct 25, 2019 at 2:08 PM Xingbo Jiang <jiangxb1...@gmail.com> wrote:
>
> Hi all,
>
> I would like to bring out a discussion on how many packages shall be released 
> in 3.0.0-preview, the ones I can think of now:
>
> * scala 2.12 + hadoop 2.7
> * scala 2.12 + hadoop 3.2
> * scala 2.12 + hadoop 3.2 + JDK 11
>
> Do you have other combinations to add to the above list?
>
> Cheers,
>
> Xingbo

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
