I'm using spark-ec2 to run some Spark code.  When I set the master to
"local", it runs fine.  However, when I set the master to $MASTER, the
workers immediately fail with java.lang.NoClassDefFoundError for my
classes.

I've used sbt-assembly to build a jar containing the classes, confirmed
with jar tvf that the classes are there, and configured SparkConf to
distribute the jar.  The Spark web UI does show the assembly jar as
added to the classpath:
http://172.x.x.x:47441/jars/myjar-assembly-1.0.jar

It seems that, even though myjar-assembly contains the class and is
being added to the cluster, it's not reaching the workers.  How do I
fix this? (Do I need to copy the jar file manually? If so, to which
directory? I thought the point of passing jars to SparkConf was to do
this automatically.)
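For reference, here is roughly how I'm setting it up. This is a minimal sketch, not my exact code; the app name, class, and jar path are placeholders:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object MyApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("MyApp")
      // $MASTER is the spark:// URL exported by spark-ec2; falls back to local.
      .setMaster(sys.env.getOrElse("MASTER", "local[*]"))
      // Ship the assembly jar so executors can load the application classes.
      .setJars(Seq("/path/to/myjar-assembly-1.0.jar"))

    val sc = new SparkContext(conf)
    // ... job code that uses the classes from the assembly jar ...
    sc.stop()
  }
}
```

My understanding is that setJars should make the driver serve the jar over HTTP (which matches the /jars/ URL shown in the UI) and the executors fetch it before running tasks.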
