rednaxelafx commented on a change in pull request #28463:
URL: https://github.com/apache/spark/pull/28463#discussion_r422054048



##########
File path: core/src/main/scala/org/apache/spark/util/ClosureCleaner.scala
##########
@@ -372,14 +342,64 @@ private[spark] object ClosureCleaner extends Logging {
 
       logDebug(s" +++ closure $func (${func.getClass.getName}) is now cleaned 
+++")
     } else {
-      logDebug(s"Cleaning lambda: ${lambdaFunc.get.getImplMethodName}")
+      val lambdaProxy = maybeIndylambdaProxy.get
+      val implMethodName = lambdaProxy.getImplMethodName
+
+      logDebug(s"Cleaning indylambda closure: $implMethodName")
+
+      // capturing class is the class that declared this lambda
+      val capturingClassName = lambdaProxy.getCapturingClass.replace('/', '.')
+      val classLoader = func.getClass.getClassLoader // this is the safest 
option
+      // scalastyle:off classforname
+      val capturingClass = Class.forName(capturingClassName, false, 
classLoader)
+      // scalastyle:on classforname
 
-      val captClass = 
Utils.classForName(lambdaFunc.get.getCapturingClass.replace('/', '.'),
-        initialize = false, noSparkClassLoader = true)
       // Fail fast if we detect return statements in closures
-      getClassReader(captClass)
-        .accept(new 
ReturnStatementFinder(Some(lambdaFunc.get.getImplMethodName)), 0)
-      logDebug(s" +++ Lambda closure (${lambdaFunc.get.getImplMethodName}) is 
now cleaned +++")
+      val capturingClassReader = getClassReader(capturingClass)
+      capturingClassReader.accept(new 
ReturnStatementFinder(Option(implMethodName)), 0)
+
+      val isClosureDeclaredInScalaRepl = 
capturingClassName.startsWith("$line") &&
+        capturingClassName.endsWith("$iw")
+      val outerThisOpt = if (lambdaProxy.getCapturedArgCount > 0) {
+        Option(lambdaProxy.getCapturedArg(0))
+      } else {
+        None
+      }
+
+      // only need to clean when there is an enclosing "this" captured by the 
closure, and it
+      // should be something cleanable, i.e. a Scala REPL line object
+      val needsCleaning = isClosureDeclaredInScalaRepl &&
+        outerThisOpt.isDefined && outerThisOpt.get.getClass.getName == 
capturingClassName
+
+      if (needsCleaning) {
+        // indylambda closures do not reference enclosing closures via an 
`$outer` chain, so no
+        // transitive cleaning on the `$outer` chain is needed.
+        // Thus clean() shouldn't be recursively called with a non-empty 
accessedFields.
+        assert(accessedFields.isEmpty)
+
+        initAccessedFields(accessedFields, Seq(capturingClass))
+        IndylambdaScalaClosures.findAccessedFields(
+          lambdaProxy, classLoader, accessedFields, cleanTransitively)
+
+        logDebug(s" + fields accessed by starting closure: 
${accessedFields.size} classes")
+        accessedFields.foreach { f => logDebug("     " + f) }
+
+        if (accessedFields(capturingClass).size < 
capturingClass.getDeclaredFields.length) {

Review comment:
       Rephrased reply:
   
   > Not `accessedFields(capturingClass).size` but `accessedFields.map(_._2.size).sum` here?
   
   Nope, my current version makes more sense here.
   
   Context:
   - indylambda Scala closures don't have a `$outer` chain for nested closures
   - Scala REPL line objects (as well as other Scala inner classes) still do have the `$outer` chain
   - Spark's `ClosureCleaner` is only supposed to be capable of cleaning old-style closure objects and REPL line objects, and nothing else
   - Since indylambda closures won't be on the `$outer` chain, the only possible outer object that Spark's `ClosureCleaner` can clean is a Scala REPL line object, i.e. `$iw` (which can have its own `$outer` chain of `$iw`s, but the old code didn't support cleaning that further chain of `$iw`s); see the sketch after this list
   - We make assumptions about Scala REPL line objects. One implicit assumption is that we know all of their behavior, and that they don't involve complex inheritance. cf. https://gist.github.com/rednaxelafx/e9ecd09bbd1c448dbddad4f4edf25d48#closurecleanerisclosure-is-too-loose
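
   For concreteness, here is a hedged sketch (not the PR's code; `serializedLambdaOf` is a made-up helper name) of pulling the `java.lang.invoke.SerializedLambda` proxy out of a serializable Scala 2.12 lambda and inspecting it, which is where the "only possible outer object" above comes from:

   ```scala
   import java.lang.invoke.SerializedLambda

   // Sketch only: a serializable lambda class carries a synthetic writeReplace()
   // that returns its SerializedLambda proxy; this mirrors the idea behind
   // `maybeIndylambdaProxy` in the diff above.
   def serializedLambdaOf(closure: AnyRef): Option[SerializedLambda] = {
     try {
       val writeReplace = closure.getClass.getDeclaredMethod("writeReplace")
       writeReplace.setAccessible(true)
       writeReplace.invoke(closure) match {
         case sl: SerializedLambda => Some(sl)
         case _ => None
       }
     } catch {
       case _: NoSuchMethodException => None
     }
   }

   // For an indylambda closure declared in a REPL line object, the proxy exposes:
   //   sl.getCapturingClass   : internal name of the declaring class (a "$line...$iw")
   //   sl.getImplMethodName   : the synthetic method holding the lambda body
   //   sl.getCapturedArg(0)   : the captured enclosing "this" (the `$iw` instance),
   //                            when the body references a member of the line object
   // The lambda object itself has no `$outer` field, hence no `$outer` chain to walk.
   ```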
   
   So processing super classes is unnecessary if the `isClosure` check is strict. The old check (via `ClosureCleaner.isClosure`) was too loose for the old-style closures, but the new check is fairly strict, so it won't accidentally match non-closure classes.
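
   As a small illustration of how strict the new shape check is (pasteable into a Scala REPL; the class names below are illustrative, not taken from a real session):

   ```scala
   // Mirrors `isClosureDeclaredInScalaRepl` in the diff above: only names shaped
   // like a Scala REPL line object pass.
   def looksLikeReplLineObject(className: String): Boolean =
     className.startsWith("$line") && className.endsWith("$iw")

   println(looksLikeReplLineObject("$line14.$read$$iw$$iw"))      // true
   println(looksLikeReplLineObject("com.example.Outer$Inner"))    // false
   println(looksLikeReplLineObject("com.example.Foo$$anonfun$1")) // false: a name
   // like this could fool a looser, contains-based check, but it is clearly not
   // a REPL line object
   ```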
   
   Thus there should actually be only one key in `accessedFields` in this new code path, which is the enclosing REPL line object. And we don't need to care about its super class (it's just `java.lang.Object`, which has no fields).
   
   `accessedFields(capturingClass).size` counts the accessed fields that are directly declared on the REPL line object; `capturingClass.getDeclaredFields.length` counts the fields directly declared on the REPL line object. Since super classes don't need to be taken into account here, this is the right check.
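
   To make the quoted comparison concrete, here is a toy illustration (`FakeLineObject`, `AccessedFieldsDemo`, and the hand-built `accessedFields` map are made up, not Spark API) of why the two expressions agree on this path and why `getDeclaredFields` is the right denominator:

   ```scala
   import scala.collection.mutable

   // Toy stand-in for a REPL line object: its fields are declared directly on it,
   // and its superclass is plain java.lang.Object, which contributes no fields.
   class FakeLineObject extends Serializable {
     val used = 1
     val unusedA = 2
     val unusedB = 3
   }

   object AccessedFieldsDemo {
     def main(args: Array[String]): Unit = {
       val capturingClass: Class[_] = classOf[FakeLineObject]

       // Pretend the field analysis found that the closure only reads `used`.
       val accessedFields = mutable.Map[Class[_], mutable.Set[String]](
         capturingClass -> mutable.Set("used"))

       // There is only one key (the REPL line object) on this code path, so the
       // per-class count and the summed count are the same number:
       assert(accessedFields(capturingClass).size == accessedFields.map(_._2.size).sum)

       // getDeclaredFields only counts fields declared directly on the class
       // (the three vals here, ignoring any compiler-synthetic fields), so having
       // fewer accessed fields than declared fields means some of them could be
       // nulled out on a cloned outer object.
       val worthCleaning =
         accessedFields(capturingClass).size < capturingClass.getDeclaredFields.length
       println(s"worth cleaning: $worthCleaning")
     }
   }
   ```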



