Dear devs,
I've been stuck on this issue for several days, and I could really use some help.
At first, I ran into an old issue, the same one described here:
http://apache-spark-developers-list.1001551.n3.nabble.com/test-cases-stuck-on-quot-local-cluster-mode-quot-of-ReplSuite-td3086.html
So I checked my assembly jar, added the assembly jar to the dependencies of the
core project (I run the unit tests within this sub-project), and set SPARK_HOME
(even though SPARK_HOME was not set incorrectly before).
After that, the unit tests in local-cluster mode no longer hang, but they throw a
*ClassNotFoundException*, such as:
Job aborted due to stage failure: Task 1 in stage 0.0 failed 4 times, most
recent failure:
Lost task 1.3 in stage 0.0 (TID 5, localhost, executor 3):
java.lang.ClassNotFoundException:
org.apache.spark.broadcast.BroadcastSuite$$anonfun$15$$anonfun$16
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)
...
Driver stacktrace:
org.apache.spark.SparkException: Job aborted due to stage failure:
Task 1 in stage 0.0 failed 4 times, most recent failure: Lost task 1.3 in
stage 0.0 (TID 5, localhost, executor 3): java.lang.ClassNotFoundException:
org.apache.spark.broadcast.BroadcastSuite$$anonfun$15$$anonfun$16
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)
...
I then tried rebuilding the whole Spark project and the core project alone,
building/testing the core project, adding the
'spark.driver.extraClassPath'/'spark.executor.extraClassPath' parameters
(roughly as in the sketch below), and so on, but all of these attempts failed.
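For reference, this is roughly how I set those options (a hedged sketch only; the
jar path is a placeholder for my locally built assembly jar, and the
worker/core/memory numbers are just what I happened to try):

    import org.apache.spark.{SparkConf, SparkContext}

    // Placeholder path to the locally built assembly jar under assembly/target.
    val assemblyJar = "/path/to/spark-assembly.jar"

    val conf = new SparkConf()
      .setMaster("local-cluster[2, 1, 1024]")  // 2 workers, 1 core and 1024 MB each
      .setAppName("extraClassPath-debug")
      .set("spark.driver.extraClassPath", assemblyJar)
      .set("spark.executor.extraClassPath", assemblyJar)

    val sc = new SparkContext(conf)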
Maybe I am missing something when I try to run unit tests in *local-cluster* mode
in *IntelliJ IDEA*.
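In case it helps, a stripped-down version of the kind of test I am running looks
roughly like this (my own sketch, not the actual BroadcastSuite code):

    import org.apache.spark.SparkContext

    object LocalClusterRepro {
      def main(args: Array[String]): Unit = {
        // local-cluster[numWorkers, coresPerWorker, memoryPerWorkerMB]: executors run
        // in separate JVMs, so they must find the test classes on their own classpath.
        val sc = new SparkContext("local-cluster[2, 1, 1024]", "local-cluster-repro")
        try {
          // The map closure compiles into an anonymous class of this object, which is
          // the kind of $$anonfun class the executors fail to load above.
          val doubled = sc.parallelize(1 to 10, 2).map(_ * 2).collect()
          println(doubled.mkString(","))
        } finally {
          sc.stop()
        }
      }
    }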
I'd really appreciate it if anyone could give me a hint. Thanks.
Best wishes.
wuyi