[ https://issues.apache.org/jira/browse/SPARK-33772?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17416218#comment-17416218 ]
Dongjoon Hyun edited comment on SPARK-33772 at 9/16/21, 4:28 PM:
-----------------------------------------------------------------

I used your example. It seems that Spark can run with those options, as shown below. The following runs use Spark 3.2.0 RC2 on Java 17 and Java 8.

{code}
$ bin/spark-shell --driver-java-options "-XX:+IgnoreUnrecognizedVMOptions --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/jdk.internal.misc=ALL-UNNAMED"
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.2.0
      /_/

Using Scala version 2.13.5 (OpenJDK 64-Bit Server VM, Java 17)
Type in expressions to have them evaluated.
Type :help for more information.
21/09/16 09:24:00 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Spark context Web UI available at http://localhost:4040
Spark context available as 'sc' (master = local[*], app id = local-1631809441130).
Spark session available as 'spark'.
{code}

{code}
$ bin/spark-shell --driver-java-options "-XX:+IgnoreUnrecognizedVMOptions --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/jdk.internal.misc=ALL-UNNAMED"
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.2.0
      /_/

Using Scala version 2.13.5 (OpenJDK 64-Bit Server VM, Java 1.8.0_302)
Type in expressions to have them evaluated.
Type :help for more information.
21/09/16 09:27:52 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Spark context Web UI available at http://localhost:4040
Spark context available as 'sc' (master = local[*], app id = local-1631809673158).
Spark session available as 'spark'.
{code}

was (Author: dongjoon):
I used your example. It seems that Spark can run with those options, as shown below. The following run uses Spark 3.2.0 RC2 on Java 17.

{code}
$ bin/spark-shell --driver-java-options "-XX:+IgnoreUnrecognizedVMOptions --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/jdk.internal.misc=ALL-UNNAMED"
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.2.0
      /_/

Using Scala version 2.13.5 (OpenJDK 64-Bit Server VM, Java 17)
Type in expressions to have them evaluated.
Type :help for more information.
21/09/16 09:24:00 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Spark context Web UI available at http://localhost:4040
Spark context available as 'sc' (master = local[*], app id = local-1631809441130).
Spark session available as 'spark'.
{code}

> Build and Run Spark on Java 17
> ------------------------------
>
>                 Key: SPARK-33772
>                 URL: https://issues.apache.org/jira/browse/SPARK-33772
>             Project: Spark
>          Issue Type: New Feature
>          Components: Build
>    Affects Versions: 3.3.0
>            Reporter: Dongjoon Hyun
>            Priority: Major
>
> Apache Spark supports Java 8 and Java 11 (LTS). The next Java LTS version is 17.
> ||Version||Release Date||
> |Java 17 (LTS)|September 2021|
> Apache Spark has a release plan, and `Spark 3.2 Code freeze` was July, along with the release branch cut.
> - https://spark.apache.org/versioning-policy.html
> Supporting a new Java version is considered a new feature, which we cannot allow to be backported.

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
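
For cluster deployments, the same JVM options can also be supplied through Spark's configuration instead of the spark-shell command line. A minimal sketch using the standard `spark.driver.extraJavaOptions` and `spark.executor.extraJavaOptions` properties; the option list is copied from the run above, and whether it is sufficient for every workload is not verified here:

{code}
# conf/spark-defaults.conf (sketch)
# -XX:+IgnoreUnrecognizedVMOptions keeps the same configuration usable on
# Java 8, which does not recognize the --add-opens module-system flag.
spark.driver.extraJavaOptions    -XX:+IgnoreUnrecognizedVMOptions --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/jdk.internal.misc=ALL-UNNAMED
spark.executor.extraJavaOptions  -XX:+IgnoreUnrecognizedVMOptions --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/jdk.internal.misc=ALL-UNNAMED
{code}

The same values can alternatively be passed per-job with `--conf spark.driver.extraJavaOptions=...` on `spark-submit`.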