So the above issue occurs when building and testing a Maven project that uses 
Spark 3.3.0 with Java 17, rather than when testing the spark-3.3 source code 
itself?

If so, you may need to add the following Java options to the `argLine` of the 
`maven-surefire-plugin` for Java 17:

```
--add-opens=java.base/java.lang=ALL-UNNAMED
--add-opens=java.base/java.lang.invoke=ALL-UNNAMED
--add-opens=java.base/java.lang.reflect=ALL-UNNAMED
--add-opens=java.base/java.io=ALL-UNNAMED
--add-opens=java.base/java.net=ALL-UNNAMED
--add-opens=java.base/java.nio=ALL-UNNAMED
--add-opens=java.base/java.util=ALL-UNNAMED
--add-opens=java.base/java.util.concurrent=ALL-UNNAMED
--add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED
--add-opens=java.base/sun.nio.ch=ALL-UNNAMED
--add-opens=java.base/sun.nio.cs=ALL-UNNAMED
--add-opens=java.base/sun.security.action=ALL-UNNAMED
--add-opens=java.base/sun.util.calendar=ALL-UNNAMED
```

These are the options used to pass all of Spark's own unit tests; you may not need all of them.
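
For example, a minimal sketch of the corresponding surefire configuration 
(untested as-is; trim the list to the options your tests actually need) might 
look like:

```
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <version>3.0.0-M7</version>
  <configuration>
    <!-- Open the JDK-internal packages that Spark accesses at runtime -->
    <argLine>
      --add-opens=java.base/java.lang=ALL-UNNAMED
      --add-opens=java.base/java.lang.invoke=ALL-UNNAMED
      --add-opens=java.base/java.lang.reflect=ALL-UNNAMED
      --add-opens=java.base/java.io=ALL-UNNAMED
      --add-opens=java.base/java.net=ALL-UNNAMED
      --add-opens=java.base/java.nio=ALL-UNNAMED
      --add-opens=java.base/java.util=ALL-UNNAMED
      --add-opens=java.base/java.util.concurrent=ALL-UNNAMED
      --add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED
      --add-opens=java.base/sun.nio.ch=ALL-UNNAMED
      --add-opens=java.base/sun.nio.cs=ALL-UNNAMED
      --add-opens=java.base/sun.security.action=ALL-UNNAMED
      --add-opens=java.base/sun.util.calendar=ALL-UNNAMED
    </argLine>
  </configuration>
</plugin>
```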

However, these options don't need to be added explicitly when using 
spark-shell, spark-sql, or spark-submit, though you may need to add others as 
required for Java 17.

Perhaps some instructions about this should be added to the documentation.

Yang Jie





From: Greg Kopff <g...@q10stats.com>
Date: Thursday, 23 June 2022, 14:11
To: "Yang,Jie(INF)" <yangji...@baidu.com>
Cc: "user@spark.apache.org" <user@spark.apache.org>
Subject: Re: [Java 17] --add-exports required?

Hi.

I am running on macOS 12.4, using an ‘Adoptium’ JDK from 
https://adoptium.net/download. The version details are:

$ java -version
openjdk version "17.0.3" 2022-04-19
OpenJDK Runtime Environment Temurin-17.0.3+7 (build 17.0.3+7)
OpenJDK 64-Bit Server VM Temurin-17.0.3+7 (build 17.0.3+7, mixed mode, sharing)

I have attached an example Maven project which demonstrates the error.



If you run 'mvn clean test' it should fail with:

[ERROR] ExampleTest  Time elapsed: 1.194 s  <<< ERROR!
java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in 
unnamed module @0x41a962cf) cannot access class sun.nio.ch.DirectBuffer (in 
module java.base) because module java.base does not export 
sun.nio.ch to unnamed module @0x41a962cf

Some of the diagnostic output from running with Maven with the -X flag is:

Apache Maven 3.8.6 (84538c9988a25aec085021c365c560670ad80f63)
Maven home: /usr/local/apache-maven/apache-maven-3.8.6
Java version: 17.0.3, vendor: Eclipse Adoptium, runtime: 
/Library/Java/JavaVirtualMachines/temurin-17.jdk/Contents/Home
Default locale: en_AU, platform encoding: UTF-8
OS name: "mac os x", version: "12.4", arch: "x86_64", family: "mac"

[DEBUG] boot(compact) classpath:  surefire-booter-3.0.0-M7.jar  
surefire-api-3.0.0-M7.jar  surefire-logger-api-3.0.0-M7.jar  
surefire-shared-utils-3.0.0-M7.jar  surefire-extensions-spi-3.0.0-M7.jar  
test-classes  classes  junit-4.13.2.jar  hamcrest-core-1.3.jar  
hamcrest-all-1.3.jar  spark-core_2.12-3.3.0.jar  avro-1.11.0.jar  
jackson-core-2.12.5.jar  commons-compress-1.21.jar  avro-mapred-1.11.0.jar  
avro-ipc-1.11.0.jar  xz-1.9.jar  chill_2.12-0.10.0.jar  kryo-shaded-4.0.2.jar  
minlog-1.3.0.jar  objenesis-2.5.1.jar  chill-java-0.10.0.jar  
xbean-asm9-shaded-4.20.jar  hadoop-client-api-3.3.2.jar  
hadoop-client-runtime-3.3.2.jar  commons-logging-1.1.3.jar  
spark-launcher_2.12-3.3.0.jar  spark-kvstore_2.12-3.3.0.jar  
leveldbjni-all-1.8.jar  jackson-annotations-2.13.3.jar  
spark-network-common_2.12-3.3.0.jar  tink-1.6.1.jar  gson-2.8.6.jar  
spark-network-shuffle_2.12-3.3.0.jar  spark-unsafe_2.12-3.3.0.jar  
activation-1.1.1.jar  curator-recipes-2.13.0.jar  curator-framework-2.13.0.jar  
curator-client-2.13.0.jar  guava-16.0.1.jar  zookeeper-3.6.2.jar  
commons-lang-2.6.jar  zookeeper-jute-3.6.2.jar  audience-annotations-0.5.0.jar  
jakarta.servlet-api-4.0.3.jar  commons-codec-1.15.jar  commons-lang3-3.12.0.jar 
 commons-math3-3.6.1.jar  commons-text-1.9.jar  commons-io-2.11.0.jar  
commons-collections-3.2.2.jar  commons-collections4-4.4.jar  jsr305-3.0.0.jar  
slf4j-api-1.7.32.jar  jul-to-slf4j-1.7.32.jar  jcl-over-slf4j-1.7.32.jar  
log4j-slf4j-impl-2.17.2.jar  log4j-api-2.17.2.jar  log4j-core-2.17.2.jar  
log4j-1.2-api-2.17.2.jar  compress-lzf-1.1.jar  snappy-java-1.1.8.4.jar  
lz4-java-1.8.0.jar  zstd-jni-1.5.2-1.jar  RoaringBitmap-0.9.25.jar  
shims-0.9.25.jar  scala-xml_2.12-1.2.0.jar  scala-library-2.12.15.jar  
scala-reflect-2.12.15.jar  json4s-jackson_2.12-3.7.0-M11.jar  
json4s-core_2.12-3.7.0-M11.jar  json4s-ast_2.12-3.7.0-M11.jar  
json4s-scalap_2.12-3.7.0-M11.jar  jersey-client-2.34.jar  
jakarta.ws.rs-api-2.1.6.jar  jakarta.inject-2.6.1.jar  
jersey-common-2.34.jar  jakarta.annotation-api-1.3.5.jar  
osgi-resource-locator-1.0.3.jar  jersey-server-2.34.jar  
jakarta.validation-api-2.0.2.jar  jersey-container-servlet-2.34.jar  
jersey-container-servlet-core-2.34.jar  jersey-hk2-2.34.jar  
hk2-locator-2.6.1.jar  aopalliance-repackaged-2.6.1.jar  hk2-api-2.6.1.jar  
hk2-utils-2.6.1.jar  javassist-3.25.0-GA.jar  netty-all-4.1.74.Final.jar  
netty-buffer-4.1.74.Final.jar  netty-codec-4.1.74.Final.jar  
netty-common-4.1.74.Final.jar  netty-handler-4.1.74.Final.jar  
netty-tcnative-classes-2.0.48.Final.jar  netty-resolver-4.1.74.Final.jar  
netty-transport-4.1.74.Final.jar  
netty-transport-classes-epoll-4.1.74.Final.jar  
netty-transport-native-unix-common-4.1.74.Final.jar  
netty-transport-classes-kqueue-4.1.74.Final.jar  
netty-transport-native-epoll-4.1.74.Final-linux-x86_64.jar  
netty-transport-native-epoll-4.1.74.Final-linux-aarch_64.jar  
netty-transport-native-kqueue-4.1.74.Final-osx-x86_64.jar  
netty-transport-native-kqueue-4.1.74.Final-osx-aarch_64.jar  stream-2.9.6.jar  
metrics-core-4.2.7.jar  metrics-jvm-4.2.7.jar  metrics-json-4.2.7.jar  
metrics-graphite-4.2.7.jar  metrics-jmx-4.2.7.jar  jackson-databind-2.13.3.jar  
jackson-module-scala_2.12-2.13.3.jar  paranamer-2.8.jar  ivy-2.5.0.jar  
oro-2.0.8.jar  pickle-1.2.jar  py4j-0.10.9.5.jar  spark-tags_2.12-3.3.0.jar  
commons-crypto-1.1.0.jar  unused-1.0.0.jar  spark-sql_2.12-3.3.0.jar  
rocksdbjni-6.20.3.jar  univocity-parsers-2.9.1.jar  spark-sketch_2.12-3.3.0.jar 
 spark-catalyst_2.12-3.3.0.jar  scala-parser-combinators_2.12-1.1.2.jar  
janino-3.0.16.jar  commons-compiler-3.0.16.jar  antlr4-runtime-4.8.jar  
arrow-vector-7.0.0.jar  arrow-format-7.0.0.jar  arrow-memory-core-7.0.0.jar  
flatbuffers-java-1.12.0.jar  arrow-memory-netty-7.0.0.jar  orc-core-1.7.4.jar  
orc-shims-1.7.4.jar  protobuf-java-2.5.0.jar  aircompressor-0.21.jar  
annotations-17.0.0.jar  threeten-extra-1.5.0.jar  orc-mapreduce-1.7.4.jar  
hive-storage-api-2.7.2.jar  parquet-column-1.12.2.jar  
parquet-common-1.12.2.jar  parquet-encoding-1.12.2.jar  
parquet-hadoop-1.12.2.jar  parquet-format-structures-1.12.2.jar  
parquet-jackson-1.12.2.jar  surefire-junit4-3.0.0-M7.jar  
common-java5-3.0.0-M7.jar  common-junit3-3.0.0-M7.jar  
common-junit4-3.0.0-M7.jar
[DEBUG] Forking command line: /bin/sh -c cd '/Users/greg/devel/spark-java17' && 
'/Library/Java/JavaVirtualMachines/temurin-17.jdk/Contents/Home/bin/java' 
'-jar' 
'/Users/greg/devel/spark-java17/target/surefire/surefirebooter-20220623160237651_3.jar'
 '/Users/greg/devel/spark-java17/target/surefire' 
'2022-06-23T16-02-37_489-jvmRun1' 'surefire-20220623160237651_1tmp' 
'surefire_0-20220623160237651_2tmp'
[DEBUG] Fork Channel [1] connected to the client.
[INFO] Running ExampleTest
Using Spark's default log4j profile: org/apache/spark/log4j2-defaults.properties
22/06/23 16:02:38 INFO SparkContext: Running Spark version 3.3.0
22/06/23 16:02:38 WARN NativeCodeLoader: Unable to load native-hadoop library 
for your platform... using builtin-java classes where applicable
22/06/23 16:02:38 INFO ResourceUtils: 
==============================================================
22/06/23 16:02:38 INFO ResourceUtils: No custom resources configured for 
spark.driver.
22/06/23 16:02:38 INFO ResourceUtils: 
==============================================================
22/06/23 16:02:38 INFO SparkContext: Submitted application: 
e42fa5ea-9b36-42e9-86fe-0a8d58f451b8
22/06/23 16:02:38 INFO ResourceProfile: Default ResourceProfile created, 
executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , 
memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: 
offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: 
cpus, amount: 1.0)
22/06/23 16:02:38 INFO ResourceProfile: Limiting resource is cpu
22/06/23 16:02:38 INFO ResourceProfileManager: Added ResourceProfile id: 0
22/06/23 16:02:38 INFO SecurityManager: Changing view acls to: greg
22/06/23 16:02:38 INFO SecurityManager: Changing modify acls to: greg
22/06/23 16:02:38 INFO SecurityManager: Changing view acls groups to:
22/06/23 16:02:38 INFO SecurityManager: Changing modify acls groups to:
22/06/23 16:02:38 INFO SecurityManager: SecurityManager: authentication 
disabled; ui acls disabled; users  with view permissions: Set(greg); groups 
with view permissions: Set(); users  with modify permissions: Set(greg); groups 
with modify permissions: Set()
22/06/23 16:02:39 INFO Utils: Successfully started service 'sparkDriver' on 
port 56377.
22/06/23 16:02:39 INFO SparkEnv: Registering MapOutputTracker
22/06/23 16:02:39 INFO SparkEnv: Registering BlockManagerMaster
22/06/23 16:02:39 INFO BlockManagerMasterEndpoint: Using 
org.apache.spark.storage.DefaultTopologyMapper for getting topology information
22/06/23 16:02:39 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
[ERROR] Tests run: 2, Failures: 0, Errors: 2, Skipped: 0, Time elapsed: 1.2 s 
<<< FAILURE! - in ExampleTest
[ERROR] ExampleTest  Time elapsed: 1.194 s  <<< ERROR!
java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in 
unnamed module @0x41a962cf) cannot access class sun.nio.ch.DirectBuffer (in 
module java.base) because module java.base does not export 
sun.nio.ch to unnamed module @0x41a962cf
at org.apache.spark.storage.StorageUtils$.<init>(StorageUtils.scala:213)

If I change:

      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <version>3.0.0-M7</version>
      </plugin>
    </plugins>

to:

      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <version>3.0.0-M7</version>
        <configuration>
          <argLine>--add-exports java.base/sun.nio.ch=ALL-UNNAMED</argLine>
        </configuration>
      </plugin>
    </plugins>

… then the tests run successfully.

If there’s anything else you need, please let me know.

—
Greg.


On 23 Jun 2022, at 3:42 pm, Yang,Jie(INF) <yangji...@baidu.com> wrote:

Hi, Greg

"--add-exports java.base/sun.nio.ch<http://sun.nio.ch>=ALL-UNNAMED " does not 
need to be added when SPARK-33772 is completed, so in order to answer your 
question, I need more details for testing:
1.  Where can I download Java 17 (Temurin-17+35)?
2.  What test commands do you use?

Yang Jie

On 2022/6/23 at 12:54, "Greg Kopff" <g...@q10stats.com> wrote:

   Hi.

   According to the release notes[1], and specifically the ticket "Build and Run 
Spark on Java 17" (SPARK-33772)[2], Spark now supports running on Java 17.

   However, using Java 17 (Temurin-17+35) with Maven (3.8.6) and 
maven-surefire-plugin (3.0.0-M7), when running a unit test that uses Spark 
(3.3.0), it fails with:

   java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ 
(in unnamed module @0x1e7ba8d9) cannot access class sun.nio.ch.DirectBuffer (in 
module java.base) because module java.base does not export 
sun.nio.ch to unnamed module @0x1e7ba8d9

   The full stack is:

   java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ 
(in unnamed module @0x1e7ba8d9) cannot access class sun.nio.ch.DirectBuffer (in 
module java.base) because module java.base does not export 
sun.nio.ch to unnamed module @0x1e7ba8d9
     at org.apache.spark.storage.StorageUtils$.<init>(StorageUtils.scala:213)
     at org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala)
     at 
org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:114)
     at org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:353)
     at 
org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:290)
     at org.apache.spark.SparkEnv$.create(SparkEnv.scala:339)
     at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:194)
     at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:279)
     at org.apache.spark.SparkContext.<init>(SparkContext.scala:464)
     at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2704)
     at 
org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:953)
     at scala.Option.getOrElse(Option.scala:189)
     at 
org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:947)
     […]

   There is a recent StackOverflow question, "Java 17 solution for Spark - 
java.lang.NoClassDefFoundError: Could not initialize class 
org.apache.spark.storage.StorageUtils"[3], asked only two months ago, but it 
predated the Spark 3.3.0 release, and thus predated official support for 
Java 17.  The solution proposed there amounts to adding this configuration to 
the Surefire plugin:

   <configuration>
     <argLine>--add-exports java.base/sun.nio.ch=ALL-UNNAMED</argLine>
   </configuration>

   And, yes, this works.

   Now, I understand what this flag achieves … without it the JVM module system 
won’t allow Spark to use the sun.nio.ch.DirectBuffer class.  My question is 
whether the requirement to add this flag is currently documented anywhere.  I 
couldn’t find it, and it’s likely to start affecting people when they switch to 
Java 17.  Right now the web is mostly full of suggestions to use an earlier 
version of Java.

   Cheers,

   —
   Greg.


   [1]: https://spark.apache.org/releases/spark-release-3-3-0.html
   [2]: https://issues.apache.org/jira/browse/SPARK-33772
   [3]: https://stackoverflow.com/questions/72230174
   ---------------------------------------------------------------------
   To unsubscribe e-mail: user-unsubscr...@spark.apache.org

