[ 
https://issues.apache.org/jira/browse/SPARK-24417?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16774954#comment-16774954
 ] 

M. Le Bihan commented on SPARK-24417:
-------------------------------------

It becomes really troublesome to see Java 12 arriving in a few weeks while _Spark_, 
otherwise an impressive piece of technology, remains held on a JVM from 2014. I 
have three questions, please:

1) Which version of Spark will become compatible with Java 11? 2.4.1, 2.4.2, or 
3.0.0?

2) If Java 11 compatibility is postponed to Spark 3.0.0, when is Spark 3.0.0 
planned to be released?

3) Will Spark then become fully compatible with standard, classical, normal Java, 
or will it keep some kind of system programming that might keep it in jeopardy? 
In one word: will it suffer the same troubles when attempting to run with Java 
12, 13, 14?
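(For context, the "system programming" in question presumably refers to Spark's reliance on internal JVM APIs such as sun.misc.Unsafe, which JDK 9+ progressively encapsulates. A minimal sketch of that kind of access follows; the class name UnsafeProbe is illustrative, not from Spark.)

```java
import java.lang.reflect.Field;

// Illustrative sketch: obtaining the internal sun.misc.Unsafe singleton
// via reflection, the style of access that newer JDKs restrict or warn about.
public class UnsafeProbe {
    public static void main(String[] args) throws Exception {
        // sun.misc.Unsafe is not part of the public Java SE API.
        Class<?> unsafeClass = Class.forName("sun.misc.Unsafe");
        // Its singleton is held in a private static field named "theUnsafe".
        Field theUnsafe = unsafeClass.getDeclaredField("theUnsafe");
        theUnsafe.setAccessible(true); // may warn or fail on newer JDKs
        Object unsafe = theUnsafe.get(null);
        System.out.println("Obtained " + unsafe.getClass().getName());
    }
}
```

On JDK 8 this runs silently; on later releases such internal access is what triggers illegal-access warnings or errors, which is why each new JDK release can break code of this kind anew.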

 

Since the coming of Java 9, then Java 11, and now with Java 12 at the door, 18 
months have passed. Can we have a date when Java 11 (and Java 12) compatibility 
will be available, please?

 

> Build and Run Spark on JDK11
> ----------------------------
>
>                 Key: SPARK-24417
>                 URL: https://issues.apache.org/jira/browse/SPARK-24417
>             Project: Spark
>          Issue Type: New Feature
>          Components: Build
>    Affects Versions: 2.3.0
>            Reporter: DB Tsai
>            Priority: Major
>
> This is an umbrella JIRA for Apache Spark to support JDK11
> As JDK8 is reaching EOL, and JDK9 and 10 are already end of life, per 
> community discussion, we will skip JDK9 and 10 to support JDK 11 directly.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
