[ https://issues.apache.org/jira/browse/SPARK-12277?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon updated SPARK-12277:
---------------------------------
    Labels: bulk-closed  (was: )

> Using SparkIMain to compile and interpret a string throws java.lang.ClassNotFoundException
> -------------------------------------------------------------------------------------------
>
>                 Key: SPARK-12277
>                 URL: https://issues.apache.org/jira/browse/SPARK-12277
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.3.1
>            Reporter: emma001
>            Priority: Major
>              Labels: bulk-closed
>
> The test code is as follows:
> val scalaInstance = new SparkIMain()
> val ss = "object ScalaScript{ def execute(df1:org.apache.spark.sql.DataFrame,df2:org.apache.spark.sql.DataFrame): org.apache.spark.sql.DataFrame ={\n val tt = df2.rdd.map(t=>t(0)).collect()\n df1 \n}}"
> scalaInstance.compileString(ss)
> scalaInstance.bind("df1","org.apache.spark.sql.DataFrame",df1)
> scalaInstance.bind("df2","org.apache.spark.sql.DataFrame",df2)
> scalaInstance.interpret("val t= ScalaScript.execute(df1,df2)")
> ERROR Executor: Exception in task 0.0 in stage 1.0 (TID 1)
> java.lang.ClassNotFoundException: ScalaScript$$anonfun$1
>       at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:69)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>       at java.lang.Class.forName0(Native Method)
>       at java.lang.Class.forName(Class.java:348)
>       at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:66)
>       at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1613)
>       at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1518)
>       at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1774)
>       at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
>       at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
>       at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
>       at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
>       at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
>       at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
>       at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
>       at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
>       at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
>       at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
>       at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
>       at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
>       at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
>       at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
>       at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:69)
>       at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:95)
>       at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:58)
>       at org.apache.spark.scheduler.Task.run(Task.scala:70)
>       at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
>       at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>       at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>       at java.lang.Thread.run(Thread.java:745)
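>
> For reference, below is a self-contained sketch of the reproduction. The SparkContext/SQLContext setup, the sample DataFrames, and the import paths are assumptions added here for illustration; only the SparkIMain calls (compileString, bind, interpret) and the script body come from the report. The trace above shows org.apache.spark.repl.ExecutorClassLoader failing to load ScalaScript$$anonfun$1, the anonymous-function class that the embedded interpreter generates for the map closure, which suggests the executors cannot see the interpreter's compiled output.
>
> import org.apache.spark.{SparkConf, SparkContext}
> import org.apache.spark.sql.SQLContext
> import org.apache.spark.repl.SparkIMain
>
> object Spark12277Repro {
>   def main(args: Array[String]): Unit = {
>     // Master and deploy mode are expected to come from spark-submit (assumption).
>     val sc = new SparkContext(new SparkConf().setAppName("SPARK-12277-repro"))
>     val sqlContext = new SQLContext(sc)
>     import sqlContext.implicits._
>
>     // Sample DataFrames; the original report does not show how df1/df2 are built (assumption).
>     val df1 = sc.parallelize(Seq((1, "a"), (2, "b"))).toDF("id", "value")
>     val df2 = sc.parallelize(Seq((3, "c"), (4, "d"))).toDF("id", "value")
>
>     // Embedded interpreter, created as in the report.
>     val scalaInstance = new SparkIMain()
>
>     // The same script as in the report, rewritten with stripMargin for readability.
>     val ss =
>       """object ScalaScript {
>         |  def execute(df1: org.apache.spark.sql.DataFrame,
>         |              df2: org.apache.spark.sql.DataFrame): org.apache.spark.sql.DataFrame = {
>         |    // This closure compiles to ScalaScript$$anonfun$1, the class the executor fails to load.
>         |    val tt = df2.rdd.map(t => t(0)).collect()
>         |    df1
>         |  }
>         |}""".stripMargin
>
>     scalaInstance.compileString(ss)
>     scalaInstance.bind("df1", "org.apache.spark.sql.DataFrame", df1)
>     scalaInstance.bind("df2", "org.apache.spark.sql.DataFrame", df2)
>     scalaInstance.interpret("val t = ScalaScript.execute(df1, df2)")
>
>     sc.stop()
>   }
> }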


