Github user marmbrus commented on a diff in the pull request:

    https://github.com/apache/spark/pull/9825#discussion_r45391437
  
    --- Diff: repl/scala-2.10/src/main/scala/org/apache/spark/repl/SparkIMain.scala ---
    @@ -1221,10 +1221,16 @@ import org.apache.spark.annotation.DeveloperApi
             )
           }
     
    -      val preamble = """
    -        |class %s extends Serializable {
    -        |  %s%s%s
    -      """.stripMargin.format(lineRep.readName, envLines.map("  " + _ + ";\n").mkString, importsPreamble, indentCode(toCompute))
    +      val preamble = s"""
    +        |class ${lineRep.readName} extends Serializable {
    +        |  ${envLines.map("  " + _ + ";\n").mkString}
    +        |  $importsPreamble
    +        |
    +        |  // If we need to construct any objects defined in the REPL on an executor we will need
    +        |  // to pass the outer scope to the appropriate encoder.
    +        |  org.apache.spark.sql.catalyst.encoders.OuterScopes.addOuterScope(this)
    --- End diff --
    
    We need a handle to any outer class that defines an inner class that is going to be used in a Spark Dataset, so that we can construct new instances on the executors.  It might be helpful to also look at the changes in #9602.
    
    @dragos was saying maybe we don't have this problem in 2.11, but I have not investigated at all.
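    As a minimal sketch of the underlying JVM constraint (the class and value names below are illustrative, not from the PR): every class defined inside a REPL line wrapper is an inner class, so its compiled constructor takes a hidden first parameter referencing the enclosing instance. Constructing such an object reflectively, as an encoder must on an executor, therefore requires a handle to the outer instance, which is exactly what `OuterScopes.addOuterScope(this)` records.

    ```scala
    // Hypothetical stand-in for the generated REPL wrapper class.
    class ReplLineWrapper extends Serializable {
      // Inner case class, as if the user had typed it at the REPL prompt.
      case class Person(name: String)
    }

    val wrapper = new ReplLineWrapper

    // The compiled constructor is Person(outer: ReplLineWrapper, name: String);
    // a case class has a single public constructor.
    val ctor = classOf[wrapper.Person].getConstructors.head

    // Reflective construction fails without the outer instance as the first
    // argument -- hence the need to keep a handle to the outer scope.
    val p = ctor.newInstance(wrapper, "Ann")
    println(p)
    ```

    Passing only `"Ann"` to `newInstance` would throw, since the hidden outer parameter is missing; that is the failure mode this change avoids for Datasets of REPL-defined classes.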

