sarutak opened a new pull request, #52956:
URL: https://github.com/apache/spark/pull/52956

   ### What changes were proposed in this pull request?
   This PR proposes to change `ClosureCleaner` to work with Java 22+.
   The current `ClosureCleaner` doesn't work with Java 22. For example, the following code fails:
   ```
   val x = 100
   sc.parallelize(1 to 10).map(v => v + x).collect
   java.lang.InternalError: java.lang.IllegalAccessException: final field has no write access: $Lambda/0x00001c0001bae838.arg$1/putField, from class java.lang.Object (module java.base)
     at java.base/jdk.internal.reflect.MethodHandleAccessorFactory.newFieldAccessor(MethodHandleAccessorFactory.java:207)
     at java.base/jdk.internal.reflect.ReflectionFactory.newFieldAccessor(ReflectionFactory.java:144)
     at java.base/java.lang.reflect.Field.acquireOverrideFieldAccessor(Field.java:1200)
     at java.base/java.lang.reflect.Field.getOverrideFieldAccessor(Field.java:1169)
     at java.base/java.lang.reflect.Field.set(Field.java:836)
     at org.apache.spark.util.ClosureCleaner$.setFieldAndIgnoreModifiers(ClosureCleaner.scala:563)
     at org.apache.spark.util.ClosureCleaner$.cleanupScalaReplClosure(ClosureCleaner.scala:431)
     at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:256)
     at org.apache.spark.util.SparkClosureCleaner$.clean(SparkClosureCleaner.scala:39)
     at org.apache.spark.SparkContext.clean(SparkContext.scala:2844)
     at org.apache.spark.rdd.RDD.$anonfun$map$1(RDD.scala:425)
     at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
     at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
     at org.apache.spark.rdd.RDD.withScope(RDD.scala:417)
     at org.apache.spark.rdd.RDD.map(RDD.scala:424)
     ... 38 elided
   Caused by: java.lang.IllegalAccessException: final field has no write access: $Lambda/0x00001c0001bae838.arg$1/putField, from class java.lang.Object (module java.base)
     at java.base/java.lang.invoke.MemberName.makeAccessException(MemberName.java:889)
     at java.base/java.lang.invoke.MethodHandles$Lookup.unreflectField(MethodHandles.java:3609)
     at java.base/java.lang.invoke.MethodHandles$Lookup.unreflectSetter(MethodHandles.java:3600)
     at java.base/java.lang.invoke.MethodHandleImpl$1.unreflectField(MethodHandleImpl.java:1619)
     at java.base/jdk.internal.reflect.MethodHandleAccessorFactory.newFieldAccessor(MethodHandleAccessorFactory.java:185)
     ... 52 more
   ```
   
   The reason is that, as of Java 22, final fields can no longer be modified via reflection, a consequence of [JEP 416](https://openjdk.org/jeps/416).
   The current `ClosureCleaner` tries to overwrite the final field `arg$1` with a cloned and cleaned object, so this write fails.
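   For illustration (this sketch is not part of the PR), the captured variable of such a lambda lands in a synthetic `final` field named `arg$1`, which is exactly the field `setFieldAndIgnoreModifiers` tries to overwrite:

   ```java
   import java.lang.reflect.Field;
   import java.lang.reflect.Modifier;
   import java.util.function.IntUnaryOperator;

   public class CapturedFieldDemo {
       public static void main(String[] args) {
           int x = 100;
           IntUnaryOperator f = v -> v + x;  // closure capturing x, compiled via invokedynamic
           // LambdaMetafactory stores each captured value in a synthetic final field:
           for (Field field : f.getClass().getDeclaredFields()) {
               System.out.println(field.getName()
                   + " (final: " + Modifier.isFinal(field.getModifiers()) + ")");
           }
           // prints: arg$1 (final: true)
       }
   }
   ```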
   
   At first I considered two solutions:
   
   1. Using the Unsafe API
   2. Using the `--enable-final-field-mutation` option, which is expected to be introduced by [JEP 500](https://openjdk.org/jeps/500)
   
   But neither of them can resolve the issue, because final fields of hidden classes cannot be modified, and lambdas created internally by the JVM via the `invokedynamic` instruction are hidden classes (let's call such a lambda an indy lambda).
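   The hidden-class distinction can be checked directly: since JDK 15, lambdas spun up by `LambdaMetafactory` are defined as hidden classes, while an explicitly written anonymous class is an ordinary, non-hidden class. A minimal sketch (class names are illustrative, not from the PR):

   ```java
   import java.util.function.IntUnaryOperator;

   public class HiddenClassDemo {
       public static void main(String[] args) {
           int x = 100;
           // Indy lambda: created at runtime by LambdaMetafactory via invokedynamic.
           IntUnaryOperator indy = v -> v + x;
           // Non-indy equivalent: an ordinary class file, not hidden.
           IntUnaryOperator nonIndy = new IntUnaryOperator() {
               @Override public int applyAsInt(int v) { return v + x; }
           };
           System.out.println("indy lambda hidden:     " + indy.getClass().isHidden());   // true on JDK 15+
           System.out.println("anonymous class hidden: " + nonIndy.getClass().isHidden()); // false
       }
   }
   ```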
   
   So the solution this PR proposes is to create a non-indy lambda class by converting the indy lambda class using ASM.
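   As a rough intuition for the approach (a hypothetical hand-written stand-in, not the actual ASM-generated bytecode, whose exact field layout is up to the PR), an ordinary non-hidden class holding the captured value in a plain field stays writable through reflection:

   ```java
   import java.lang.reflect.Field;
   import java.util.function.IntUnaryOperator;

   public class NonIndyClosureDemo {
       // Hypothetical stand-in for the class the ASM conversion would produce:
       // an ordinary (non-hidden) class carrying the lambda's captured state.
       static class Closure implements IntUnaryOperator {
           int arg$1;  // mirrors the lambda's captured-argument field, but plain and writable
           Closure(int captured) { this.arg$1 = captured; }
           @Override public int applyAsInt(int v) { return v + arg$1; }
       }

       public static void main(String[] args) throws Exception {
           Closure closure = new Closure(100);
           // The ClosureCleaner-style reflective update now succeeds,
           // because Closure is a normal class rather than a hidden one.
           Field f = Closure.class.getDeclaredField("arg$1");
           f.setAccessible(true);
           f.set(closure, 0);
           System.out.println(closure.applyAsInt(1));  // prints: 1
       }
   }
   ```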
   
   ### Why are the changes needed?
   To make Spark work with Java 22+.
   
   ### Does this PR introduce _any_ user-facing change?
   No.
   
   ### How was this patch tested?
   TBD
   
   ### Was this patch authored or co-authored using generative AI tooling?
   No.
   

