Hi Ian,
When you run your packaged application, are you adding its jar file to
the SparkContext (by calling the addJar() method)?
That will distribute the code to all the worker nodes. The failure
you're seeing seems to indicate the worker nodes do not have access to
your code.
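For example, right after creating the context (just a sketch; I'm guessing at where your build puts the jar):

// Ship the application jar to the executors so the worker JVMs
// can load your classes.
sc.addJar("target/scala-2.10/simple-project_2.10-1.0.jar")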
Hi Marcelo, thanks for answering.
That didn't seem to help. I have the following now:
val sc = new SparkContext("spark://masternodeip:7077",
  "Simple App", "/usr/local/pkg/spark",
  List("target/scala-2.10/simple-project_2.10-1.0.jar"))
Hi Ian,
On Mon, Apr 14, 2014 at 11:30 AM, Ian Bonnycastle ibo...@gmail.com wrote:
val sc = new SparkContext("spark://masternodeip:7077",
  "Simple App", "/usr/local/pkg/spark",
  List("target/scala-2.10/simple-project_2.10-1.0.jar"))
Hmmm... does /usr/local/pkg/spark exist on all the worker nodes? You
could also try passing null as the sparkHome argument.
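Something like this (just your snippet with the sparkHome argument
changed; my understanding is that a null there makes each worker fall
back to its own Spark installation):

val sc = new SparkContext("spark://masternodeip:7077",
  "Simple App", null,
  List("target/scala-2.10/simple-project_2.10-1.0.jar"))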
Hi Marcelo,
Changing it to null didn't make any difference at all. /usr/local/pkg/spark
is also on all the nodes... it has to be in order to get all the nodes up
and running in the cluster. Also, I'm confused by what you meant by
"That's most probably the class that implements the closure you're
passing in."
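For reference, here is roughly what my app looks like (trimmed down,
and the input path here is made up):

import org.apache.spark.SparkContext

object SimpleApp {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext("spark://masternodeip:7077",
      "Simple App", "/usr/local/pkg/spark",
      List("target/scala-2.10/simple-project_2.10-1.0.jar"))
    val lines = sc.textFile("/tmp/input.txt")
    // Is this the closure you mean? I gather the lambda below compiles
    // to an anonymous class (something like SimpleApp$$anonfun$1) that
    // lives in my application jar, so the workers would need that jar
    // on their classpath to run it.
    println(lines.filter(line => line.contains("spark")).count())
    sc.stop()
  }
}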