>
> Probably, but what is the difference that makes supporting u81 vs.
> later releases different?
>

How about Docker support?
https://blog.softwaremill.com/docker-support-in-new-java-8-finally-fd595df0ca54
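For reference, the post above is about the container-awareness work backported to JDK 8 in update 191 (flags such as -XX:+UseContainerSupport and -XX:MaxRAMPercentage), which lines up with the jdk1.8.0_191+ builds the testing infra uses. A quick sketch (assuming `java` is on the PATH) to check whether a given build carries those flags:

```shell
# List the JVM's final flag values and look for the container flags.
# On 8u191+ (and JDK 10+) both flags are present; on older 8u builds
# the grep simply matches nothing, so the command prints no output.
java -XX:+PrintFlagsFinal -version 2>/dev/null | \
  grep -E 'UseContainerSupport|MaxRAMPercentage'
```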



On Fri, Oct 25, 2019 at 9:05 AM Takeshi Yamamuro <linguin....@gmail.com>
wrote:

> Hi, Dongjoon
>
> It might be worth clearly documenting which JDK versions we test against
> in the testing infra, e.g., at
> https://spark.apache.org/docs/latest/#downloading
>
> btw, do any other projects announce a minimum supported JDK version?
> It seems that Hadoop does not.
>
> On Fri, Oct 25, 2019 at 6:51 AM Sean Owen <sro...@gmail.com> wrote:
>
>> Probably, but what is the difference that makes supporting u81 vs.
>> later releases different?
>>
>> On Thu, Oct 24, 2019 at 4:39 PM Dongjoon Hyun <dongjoon.h...@gmail.com>
>> wrote:
>> >
>> > Hi, All.
>> >
>> > Apache Spark 3.x will support both JDK8 and JDK11.
>> >
>> > I'm wondering if we can have a minimum JDK8 version in Apache Spark 3.0.
>> >
>> > Specifically, can we start to deprecate JDK8u81 and older at 3.0?
>> >
>> > Currently, the Apache Spark testing infra tests only with
>> > jdk1.8.0_191 and above.
>> >
>> > Bests,
>> > Dongjoon.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>
>>
>
> --
> ---
> Takeshi Yamamuro
>
