> Naturally, not all JDK8 versions are the same. For example, the Hadoop
> community also has the following document, although it does not
> specify minimum versions.
Oh, I didn't know that. Thanks for the info and for updating the doc!

Bests,
Takeshi

On Fri, Oct 25, 2019 at 12:26 PM Dongjoon Hyun <dongjoon.h...@gmail.com>
wrote:

> Thank you. I created a PR for that. For now, the minimum requirement is
> 8u92 in that PR.
>
> https://github.com/apache/spark/pull/26249
>
> Bests,
> Dongjoon.
>
>
> On Thu, Oct 24, 2019 at 7:55 PM Sean Owen <sro...@gmail.com> wrote:
>
>> I think that's fine, personally. Anyone using JDK 8 should be, and
>> probably is, on a recent release.
>>
>> On Thu, Oct 24, 2019 at 8:56 PM Dongjoon Hyun <dongjoon.h...@gmail.com>
>> wrote:
>> >
>> > Thank you for reply, Sean, Shane, Takeshi.
>> >
>> > The reason is that there is a PR aiming to add
>> > `-XX:OnOutOfMemoryError="kill -9 %p"` as the default behavior at
>> > 3.0.0 (see the sketch after the links below).
>> > (Please note that the PR always adds it by *default*; there is no
>> > way for users to remove it.)
>> >
>> >     - [SPARK-27900][CORE][K8s] Add `spark.driver.killOnOOMError` flag
>> in cluster mode
>> >     - https://github.com/apache/spark/pull/26161
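>> >
>> > As a minimal sketch (assuming Scala, cluster mode, and a hypothetical
>> > app name), this is what opting into the same behavior looks like
>> > today through `spark.driver.extraJavaOptions`; the PR above would
>> > make the equivalent the default:
>> >
>> >     import org.apache.spark.SparkConf
>> >
>> >     // Explicit opt-in: pass the OnOutOfMemoryError hook to the
>> >     // driver JVM. In client mode this has to go on the spark-submit
>> >     // command line instead (--driver-java-options), because the
>> >     // driver JVM is already running by the time SparkConf is read.
>> >     val conf = new SparkConf()
>> >       .setAppName("oom-example")  // hypothetical app name
>> >       .set("spark.driver.extraJavaOptions",
>> >         "-XX:OnOutOfMemoryError=\"kill -9 %p\"")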
>> >
>> > If we can deprecate old JDK8 versions, we will be able to use the
>> > JVM option `ExitOnOutOfMemoryError` instead; a sketch of a runtime
>> > version check follows the release-notes link below.
>> > (This option was added in JDK 8u92. In my previous email, 8u82 was a
>> > typo.)
>> >
>> >     -
>> https://www.oracle.com/technetwork/java/javase/8u92-relnotes-2949471.html
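>> >
>> > As a minimal, hypothetical sketch (not Spark code), the JDK8 update
>> > release can be parsed from the `java.version` system property to
>> > check whether `ExitOnOutOfMemoryError` is available:
>> >
>> >     object JdkUpdateCheck {
>> >       // JDK8 version strings look like "1.8.0_<update>",
>> >       // e.g. "1.8.0_191" or "1.8.0_191-b12".
>> >       private val Jdk8Pattern = """1\.8\.0_(\d+).*""".r
>> >
>> >       def update(version: String =
>> >           System.getProperty("java.version")): Option[Int] =
>> >         version match {
>> >           case Jdk8Pattern(u) => Some(u.toInt)
>> >           case _              => None  // not JDK8 (e.g., "11.0.4")
>> >         }
>> >
>> >       def main(args: Array[String]): Unit = {
>> >         // The option was added in 8u92; later JDKs always have it.
>> >         val available = update().forall(_ >= 92)
>> >         println(s"ExitOnOutOfMemoryError available: $available")
>> >       }
>> >     }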
>> >
>> > Naturally, not all JDK8 versions are the same. For example, the
>> > Hadoop community also has the following document, although it does
>> > not specify minimum versions.
>> >
>> >     -
>> https://cwiki.apache.org/confluence/display/HADOOP/Hadoop+Java+Versions
>> >
>> > Bests,
>> > Dongjoon.
>> >
>> >
>> > On Thu, Oct 24, 2019 at 6:05 PM Takeshi Yamamuro <linguin....@gmail.com>
>> wrote:
>> >>
>> >> Hi, Dongjoon
>> >>
>> >> It might be worth clearly describing which JDK versions we check in
>> >> the testing infra in documents such as
>> >> https://spark.apache.org/docs/latest/#downloading
>> >>
>> >> By the way, is any other project announcing a minimum supported JDK
>> >> version? It seems that Hadoop does not.
>> >>
>> >> On Fri, Oct 25, 2019 at 6:51 AM Sean Owen <sro...@gmail.com> wrote:
>> >>>
>> >>> Probably, but what actually differs between supporting u81 and
>> >>> supporting later releases?
>> >>>
>> >>> On Thu, Oct 24, 2019 at 4:39 PM Dongjoon Hyun <
>> dongjoon.h...@gmail.com> wrote:
>> >>> >
>> >>> > Hi, All.
>> >>> >
>> >>> > Apache Spark 3.x will support both JDK8 and JDK11.
>> >>> >
>> >>> > I'm wondering if we can have a minimum JDK8 version in Apache Spark
>> 3.0.
>> >>> >
>> >>> > Specifically, can we start to deprecate JDK8u81 and older at 3.0?
>> >>> >
>> >>> > Currently, the Apache Spark testing infra tests only with
>> >>> > jdk1.8.0_191 and above.
>> >>> >
>> >>> > Bests,
>> >>> > Dongjoon.
>> >>>
>> >>> ---------------------------------------------------------------------
>> >>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>> >>>
>> >>
>> >>
>> >> --
>> >> ---
>> >> Takeshi Yamamuro
>>
>

-- 
---
Takeshi Yamamuro
