[jira] [Updated] (SPARK-35557) Adapt uses of JDK 17 Internal APIs

2021-05-31 Thread Jira


 [ 
https://issues.apache.org/jira/browse/SPARK-35557?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ismaël Mejía updated SPARK-35557:
-
Summary: Adapt uses of JDK 17 Internal APIs  (was: Adapt uses of JDK 17 
Internal APIs (Unsafe, etc))

> Adapt uses of JDK 17 Internal APIs
> --
>
> Key: SPARK-35557
> URL: https://issues.apache.org/jira/browse/SPARK-35557
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.2.0
>Reporter: Ismaël Mejía
>Priority: Major
>
> I tried to run a Spark pipeline using the most recent 3.2.0-SNAPSHOT with
> Scala 2.12.4 on Java 17 and I found this exception:
> {code:java}
> java.lang.ExceptionInInitializerError
>  at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit> (ByteArrayMethods.java:54)
>  at org.apache.spark.internal.config.package$.<clinit> (package.scala:1149)
>  at org.apache.spark.SparkConf$.<clinit> (SparkConf.scala:654)
>  at org.apache.spark.SparkConf.contains (SparkConf.scala:455)
> ...
> Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make
> private java.nio.DirectByteBuffer(long,int) accessible: module java.base does
> not "opens java.nio" to unnamed module @110df513
>  at java.lang.reflect.AccessibleObject.checkCanSetAccessible (AccessibleObject.java:357)
>  at java.lang.reflect.AccessibleObject.checkCanSetAccessible (AccessibleObject.java:297)
>  at java.lang.reflect.Constructor.checkCanSetAccessible (Constructor.java:188)
>  at java.lang.reflect.Constructor.setAccessible (Constructor.java:181)
>  at org.apache.spark.unsafe.Platform.<clinit> (Platform.java:56)
>  at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit> (ByteArrayMethods.java:54)
>  at org.apache.spark.internal.config.package$.<clinit> (package.scala:1149)
>  at org.apache.spark.SparkConf$.<clinit> (SparkConf.scala:654)
>  at org.apache.spark.SparkConf.contains (SparkConf.scala:455)
> {code}
> It seems that Java 17 will be stricter about uses of JDK internals:
> [https://openjdk.java.net/jeps/403]
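A short-term workaround for the trace above is to launch the JVM with `--add-opens java.base/java.nio=ALL-UNNAMED` (plus whatever other packages Spark's Platform touches); the longer-term adaptation this ticket tracks is to stop assuming `setAccessible(true)` succeeds. The sketch below is an illustration, not Spark's actual code: it probes the constructor named in the exception using `trySetAccessible()` (JDK 9+), which reports failure as a boolean instead of throwing `InaccessibleObjectException`.

```java
import java.lang.reflect.Constructor;

public class DirectBufferProbe {
    /**
     * Returns true if the private DirectByteBuffer(long,int) constructor
     * (the one named in the InaccessibleObjectException above) can be made
     * accessible on this JVM.
     */
    public static boolean internalCtorAccessible() {
        try {
            Class<?> cls = Class.forName("java.nio.DirectByteBuffer");
            Constructor<?> ctor = cls.getDeclaredConstructor(long.class, int.class);
            // trySetAccessible() never throws InaccessibleObjectException;
            // it returns false when java.nio is not opened to this module.
            return ctor.trySetAccessible();
        } catch (ReflectiveOperationException e) {
            // Constructor absent or renamed on this JDK build.
            return false;
        }
    }

    public static void main(String[] args) {
        // false on a stock JDK 17; true when started with
        // --add-opens java.base/java.nio=ALL-UNNAMED
        System.out.println("DirectByteBuffer(long,int) accessible: "
                + internalCtorAccessible());
    }
}
```

The result depends only on the JVM's module flags, so this check is cheap enough to run once from a static initializer.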



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-35557) Adapt uses of JDK 17 Internal APIs (Unsafe, etc)

2021-05-31 Thread Jira


 [ 
https://issues.apache.org/jira/browse/SPARK-35557?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ismaël Mejía updated SPARK-35557:
-
Description: 
I tried to run a Spark pipeline using the most recent 3.2.0-SNAPSHOT with
Scala 2.12.4 on Java 17 and I found this exception:
{code:java}
java.lang.ExceptionInInitializerError
 at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit> (ByteArrayMethods.java:54)
 at org.apache.spark.internal.config.package$.<clinit> (package.scala:1149)
 at org.apache.spark.SparkConf$.<clinit> (SparkConf.scala:654)
 at org.apache.spark.SparkConf.contains (SparkConf.scala:455)
...
Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make
private java.nio.DirectByteBuffer(long,int) accessible: module java.base does
not "opens java.nio" to unnamed module @110df513
 at java.lang.reflect.AccessibleObject.checkCanSetAccessible (AccessibleObject.java:357)
 at java.lang.reflect.AccessibleObject.checkCanSetAccessible (AccessibleObject.java:297)
 at java.lang.reflect.Constructor.checkCanSetAccessible (Constructor.java:188)
 at java.lang.reflect.Constructor.setAccessible (Constructor.java:181)
 at org.apache.spark.unsafe.Platform.<clinit> (Platform.java:56)
 at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit> (ByteArrayMethods.java:54)
 at org.apache.spark.internal.config.package$.<clinit> (package.scala:1149)
 at org.apache.spark.SparkConf$.<clinit> (SparkConf.scala:654)
 at org.apache.spark.SparkConf.contains (SparkConf.scala:455)
{code}
It seems that Java 17 will be stricter about uses of JDK internals:
[https://openjdk.java.net/jeps/403]

  was:
I tried to run a Spark pipeline using the most recent 3.2.0-SNAPSHOT with
Scala 2.13 on Java 17 and I found this exception:
{code:java}
java.lang.ExceptionInInitializerError
 at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit> (ByteArrayMethods.java:54)
 at org.apache.spark.internal.config.package$.<clinit> (package.scala:1149)
 at org.apache.spark.SparkConf$.<clinit> (SparkConf.scala:654)
 at org.apache.spark.SparkConf.contains (SparkConf.scala:455)
...
Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make
private java.nio.DirectByteBuffer(long,int) accessible: module java.base does
not "opens java.nio" to unnamed module @110df513
 at java.lang.reflect.AccessibleObject.checkCanSetAccessible (AccessibleObject.java:357)
 at java.lang.reflect.AccessibleObject.checkCanSetAccessible (AccessibleObject.java:297)
 at java.lang.reflect.Constructor.checkCanSetAccessible (Constructor.java:188)
 at java.lang.reflect.Constructor.setAccessible (Constructor.java:181)
 at org.apache.spark.unsafe.Platform.<clinit> (Platform.java:56)
 at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit> (ByteArrayMethods.java:54)
 at org.apache.spark.internal.config.package$.<clinit> (package.scala:1149)
 at org.apache.spark.SparkConf$.<clinit> (SparkConf.scala:654)
 at org.apache.spark.SparkConf.contains (SparkConf.scala:455)
{code}
It seems that Java 17 will be stricter about uses of JDK internals:
[https://openjdk.java.net/jeps/403]


> Adapt uses of JDK 17 Internal APIs (Unsafe, etc)
> 
>
> Key: SPARK-35557
> URL: https://issues.apache.org/jira/browse/SPARK-35557
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.2.0
>Reporter: Ismaël Mejía
>Priority: Major
>
> I tried to run a Spark pipeline using the most recent 3.2.0-SNAPSHOT with
> Scala 2.12.4 on Java 17 and I found this exception:
> {code:java}
> java.lang.ExceptionInInitializerError
>  at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit> (ByteArrayMethods.java:54)
>  at org.apache.spark.internal.config.package$.<clinit> (package.scala:1149)
>  at org.apache.spark.SparkConf$.<clinit> (SparkConf.scala:654)
>  at org.apache.spark.SparkConf.contains (SparkConf.scala:455)
> ...
> Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make
> private java.nio.DirectByteBuffer(long,int) accessible: module java.base does
> not "opens java.nio" to unnamed module @110df513
>  at java.lang.reflect.AccessibleObject.checkCanSetAccessible (AccessibleObject.java:357)
>  at java.lang.reflect.AccessibleObject.checkCanSetAccessible (AccessibleObject.java:297)
>  at java.lang.reflect.Constructor.checkCanSetAccessible (Constructor.java:188)
>  at java.lang.reflect.Constructor.setAccessible (Constructor.java:181)
>  at org.apache.spark.unsafe.Platform.<clinit> (Platform.java:56)
>  at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit> (ByteArrayMethods.java:54)
>  at org.apache.spark.internal.config.package$.<clinit> (package.scala:1149)
>  at org.apache.spark.SparkConf$.<clinit> (SparkConf.scala:654)
>  at org.apache.spark.SparkConf.contains (SparkConf.scala:455)
> {code}
> It seems that Java 17 will be stricter about uses of JDK internals:
> [https://openjdk.java.net/jeps/403]




[jira] [Updated] (SPARK-35557) Adapt uses of JDK 17 Internal APIs (Unsafe, etc)

2021-05-31 Thread Jira


 [ 
https://issues.apache.org/jira/browse/SPARK-35557?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ismaël Mejía updated SPARK-35557:
-
Description: 
I tried to run a Spark pipeline using the most recent 3.2.0-SNAPSHOT with
Scala 2.13 on Java 17 and I found this exception:
{code:java}
java.lang.ExceptionInInitializerError
 at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit> (ByteArrayMethods.java:54)
 at org.apache.spark.internal.config.package$.<clinit> (package.scala:1149)
 at org.apache.spark.SparkConf$.<clinit> (SparkConf.scala:654)
 at org.apache.spark.SparkConf.contains (SparkConf.scala:455)
...
Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make
private java.nio.DirectByteBuffer(long,int) accessible: module java.base does
not "opens java.nio" to unnamed module @110df513
 at java.lang.reflect.AccessibleObject.checkCanSetAccessible (AccessibleObject.java:357)
 at java.lang.reflect.AccessibleObject.checkCanSetAccessible (AccessibleObject.java:297)
 at java.lang.reflect.Constructor.checkCanSetAccessible (Constructor.java:188)
 at java.lang.reflect.Constructor.setAccessible (Constructor.java:181)
 at org.apache.spark.unsafe.Platform.<clinit> (Platform.java:56)
 at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit> (ByteArrayMethods.java:54)
 at org.apache.spark.internal.config.package$.<clinit> (package.scala:1149)
 at org.apache.spark.SparkConf$.<clinit> (SparkConf.scala:654)
 at org.apache.spark.SparkConf.contains (SparkConf.scala:455)
{code}
It seems that Java 17 will be stricter about uses of JDK internals:
[https://openjdk.java.net/jeps/403]

  was:
I tried to run a Spark pipeline using the most recent 3.2.0-SNAPSHOT with
Scala 2.13 on Java 17 and I found this exception:

{code:borderStyle=solid}
java.lang.ExceptionInInitializerError
 at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit> (ByteArrayMethods.java:54)
 at org.apache.spark.internal.config.package$.<clinit> (package.scala:1149)
 at org.apache.spark.SparkConf$.<clinit> (SparkConf.scala:654)
 at org.apache.spark.SparkConf.contains (SparkConf.scala:455)
...
Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make
private java.nio.DirectByteBuffer(long,int) accessible: module java.base does
not "opens java.nio" to unnamed module @110df513
 at java.lang.reflect.AccessibleObject.checkCanSetAccessible (AccessibleObject.java:357)
 at java.lang.reflect.AccessibleObject.checkCanSetAccessible (AccessibleObject.java:297)
 at java.lang.reflect.Constructor.checkCanSetAccessible (Constructor.java:188)
 at java.lang.reflect.Constructor.setAccessible (Constructor.java:181)
 at org.apache.spark.unsafe.Platform.<clinit> (Platform.java:56)
 at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit> (ByteArrayMethods.java:54)
 at org.apache.spark.internal.config.package$.<clinit> (package.scala:1149)
 at org.apache.spark.SparkConf$.<clinit> (SparkConf.scala:654)
 at org.apache.spark.SparkConf.contains (SparkConf.scala:455)
{code}

Not sure if this is the case here, but it seems that Java 17 will be stricter
about uses of JDK internals: https://openjdk.java.net/jeps/403


> Adapt uses of JDK 17 Internal APIs (Unsafe, etc)
> 
>
> Key: SPARK-35557
> URL: https://issues.apache.org/jira/browse/SPARK-35557
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.2.0
>Reporter: Ismaël Mejía
>Priority: Major
>
> I tried to run a Spark pipeline using the most recent 3.2.0-SNAPSHOT with
> Scala 2.13 on Java 17 and I found this exception:
> {code:java}
> java.lang.ExceptionInInitializerError
>  at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit> (ByteArrayMethods.java:54)
>  at org.apache.spark.internal.config.package$.<clinit> (package.scala:1149)
>  at org.apache.spark.SparkConf$.<clinit> (SparkConf.scala:654)
>  at org.apache.spark.SparkConf.contains (SparkConf.scala:455)
> ...
> Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make
> private java.nio.DirectByteBuffer(long,int) accessible: module java.base does
> not "opens java.nio" to unnamed module @110df513
>  at java.lang.reflect.AccessibleObject.checkCanSetAccessible (AccessibleObject.java:357)
>  at java.lang.reflect.AccessibleObject.checkCanSetAccessible (AccessibleObject.java:297)
>  at java.lang.reflect.Constructor.checkCanSetAccessible (Constructor.java:188)
>  at java.lang.reflect.Constructor.setAccessible (Constructor.java:181)
>  at org.apache.spark.unsafe.Platform.<clinit> (Platform.java:56)
>  at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit> (ByteArrayMethods.java:54)
>  at org.apache.spark.internal.config.package$.<clinit> (package.scala:1149)
>  at org.apache.spark.SparkConf$.<clinit> (SparkConf.scala:654)
>  at org.apache.spark.SparkConf.contains (SparkConf.scala:455)
> {code}
> It seems that Java 17 will be stricter about uses of JDK internals:
> [https://openjdk.java.net/jeps/403]




[jira] [Updated] (SPARK-35557) Adapt uses of JDK 17 Internal APIs (Unsafe, etc)

2021-05-31 Thread Jira


 [ 
https://issues.apache.org/jira/browse/SPARK-35557?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ismaël Mejía updated SPARK-35557:
-
Summary: Adapt uses of JDK 17 Internal APIs (Unsafe, etc)  (was: Adapt uses 
of JDK Internal APIs (Unsafe, etc))

> Adapt uses of JDK 17 Internal APIs (Unsafe, etc)
> 
>
> Key: SPARK-35557
> URL: https://issues.apache.org/jira/browse/SPARK-35557
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.2.0
>Reporter: Ismaël Mejía
>Priority: Major
>
> I tried to run a Spark pipeline using the most recent 3.2.0-SNAPSHOT with
> Scala 2.13 on Java 17 and I found this exception:
> {code:borderStyle=solid}
> java.lang.ExceptionInInitializerError
>  at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit> (ByteArrayMethods.java:54)
>  at org.apache.spark.internal.config.package$.<clinit> (package.scala:1149)
>  at org.apache.spark.SparkConf$.<clinit> (SparkConf.scala:654)
>  at org.apache.spark.SparkConf.contains (SparkConf.scala:455)
> ...
> Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make
> private java.nio.DirectByteBuffer(long,int) accessible: module java.base does
> not "opens java.nio" to unnamed module @110df513
>  at java.lang.reflect.AccessibleObject.checkCanSetAccessible (AccessibleObject.java:357)
>  at java.lang.reflect.AccessibleObject.checkCanSetAccessible (AccessibleObject.java:297)
>  at java.lang.reflect.Constructor.checkCanSetAccessible (Constructor.java:188)
>  at java.lang.reflect.Constructor.setAccessible (Constructor.java:181)
>  at org.apache.spark.unsafe.Platform.<clinit> (Platform.java:56)
>  at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit> (ByteArrayMethods.java:54)
>  at org.apache.spark.internal.config.package$.<clinit> (package.scala:1149)
>  at org.apache.spark.SparkConf$.<clinit> (SparkConf.scala:654)
>  at org.apache.spark.SparkConf.contains (SparkConf.scala:455)
> {code}
> Not sure if this is the case here, but it seems that Java 17 will be more
> strict about uses of JDK internals: https://openjdk.java.net/jeps/403
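One way such internal-API uses can be adapted (a hypothetical sketch, not the change ultimately made in Spark): probe accessibility once at class-initialization time and fall back to the public ByteBuffer API when the internal constructor is encapsulated, instead of letting `setAccessible(true)` throw from a static initializer as in the trace above. The class and constructor names come from the exception message; everything else here is illustrative.

```java
import java.lang.reflect.Constructor;
import java.nio.ByteBuffer;

public class SafeBufferAllocator {
    // Probed once, mirroring the static-initializer pattern in Platform.java,
    // but without throwing when java.nio is not opened to this module.
    private static final Constructor<?> DBB_CTOR = lookupCtor();

    private static Constructor<?> lookupCtor() {
        try {
            Class<?> cls = Class.forName("java.nio.DirectByteBuffer");
            Constructor<?> ctor = cls.getDeclaredConstructor(long.class, int.class);
            // trySetAccessible (JDK 9+) returns false instead of raising
            // InaccessibleObjectException under JEP 403's strong encapsulation.
            return ctor.trySetAccessible() ? ctor : null;
        } catch (ReflectiveOperationException e) {
            // Constructor absent or renamed on this JDK build.
            return null;
        }
    }

    /** Allocate a direct buffer; this sketch always takes the public path. */
    public static ByteBuffer allocate(int capacity) {
        if (DBB_CTOR == null) {
            // Encapsulated JDK internals: use the supported public API.
            return ByteBuffer.allocateDirect(capacity);
        }
        // A real implementation would pair the internal constructor with a
        // manually managed address (e.g. from Unsafe.allocateMemory); that
        // part is deliberately elided here.
        return ByteBuffer.allocateDirect(capacity);
    }
}
```

With this shape, class initialization succeeds on any JDK and the decision about which allocation path to use is made exactly once.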


