If you use spark-submit this exception should go away, but you haven't told us how you are submitting the job; you should have just tried it and let us know what your experience had been! Anyway, after spending long hours on this problem I realized this is actually a ClassLoader problem.
Hi,
I am currently running this code from my IDE (Eclipse). I tried adding the
scope "provided" to the dependency, without any effect. Should I build this
and submit it using the spark-submit command?
Thanks
Vaibhav
On 11 October 2016 at 04:36, Jakob Odersky wrote:
Just thought of another potential issue: you should use the "provided"
scope when depending on Spark, i.e. in your project's pom:

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>2.0.1</version>
  <scope>provided</scope>
</dependency>
On Mon, Oct 10, 2016 at 2:00 PM, Jakob Odersky wrote:
How do you submit the application? A version mismatch between the launcher,
driver and workers could lead to the bug you're seeing. A common reason for
a mismatch is if the SPARK_HOME environment variable is set. This will
cause the spark-submit script to use the launcher determined by that
environment variable.
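
For what it's worth, a minimal sketch of the build-and-submit flow being discussed might look like the following; the jar name, main class, and master URL are placeholders for your own project's values, not anything from this thread:

```shell
# Build the application jar; with Spark marked as "provided" in the pom,
# Spark's classes are left out of the jar and supplied by the cluster.
mvn clean package

# Check SPARK_HOME: if it is set, spark-submit will use the launcher from
# that installation, so it must match the version in your pom (2.0.1 here).
echo "$SPARK_HOME"

# Submit the job. --class and the jar path are hypothetical examples.
"$SPARK_HOME"/bin/spark-submit \
  --class com.example.MyApp \
  --master local[*] \
  target/my-app-1.0.jar
```

Running from the IDE skips this launcher entirely, which is why the classpath can differ from a real spark-submit run.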