[ https://issues.apache.org/jira/browse/SPARK-24417?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16676210#comment-16676210 ]

DB Tsai commented on SPARK-24417:
---------------------------------

The Scala community is working on official JDK11 support in Scala 2.12.x
[https://github.com/scala/scala-dev/issues/559]. Since the majority of Spark
users are still on Scala 2.11, I asked the Scala community whether they will
backport the JDK11 work to Scala 2.11.x. If not, we might consider making
Scala 2.12 the default Scala version in Spark 3.0, as it is a major release.
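As a side note on what JDK11 support touches at runtime: JDK 9+ changed the
version-string scheme (JDK 8 reports a specification version of "1.8", while
JDK 9, 10, and 11 report "9", "10", "11"), so any version check in the build or
at runtime has to handle both forms. A minimal Scala sketch of such a check
(hypothetical object name, not an actual Spark utility) could look like:

    object JavaMajorVersion {
      // JDK 8 reports specification version "1.8"; JDK 9+ report "9", "10", "11", ...
      def major(spec: String = System.getProperty("java.specification.version")): Int =
        spec.stripPrefix("1.").takeWhile(_.isDigit).toInt

      def main(args: Array[String]): Unit = {
        val v = major()
        if (v >= 11) println(s"Running on JDK $v: JDK11 code paths apply")
        else println(s"Running on JDK $v: keeping JDK8-compatible behavior")
      }
    }

Reading java.specification.version rather than java.version avoids having to
parse the trailing update and build components of the full version string.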

> Build and Run Spark on JDK11
> ----------------------------
>
>                 Key: SPARK-24417
>                 URL: https://issues.apache.org/jira/browse/SPARK-24417
>             Project: Spark
>          Issue Type: New Feature
>          Components: Build
>    Affects Versions: 2.3.0
>            Reporter: DB Tsai
>            Priority: Major
>
> This is an umbrella JIRA for Apache Spark to support JDK11.
> As JDK 8 is reaching EOL and JDK 9 and 10 are already end-of-life, per
> community discussion we will skip JDK 9 and 10 and support JDK 11 directly.


