Re: reduceByKey issue in example wordcount (scala)

2014-04-18 Thread Ian Bonnycastle
I just wanted to let you know, Marcelo, and others who may run into this error in the future... I figured it out! When I first started to work on my scripts, I was using "sbt/sbt package" followed by an "sbt/sbt run". But when I saw "sbt/sbt run" show that it was compiling the script, I gave up o…

Re: reduceByKey issue in example wordcount (scala)

2014-04-14 Thread Ian Bonnycastle
Hi Marcelo, Changing it to null didn't make any difference at all. /usr/local/pkg/spark is also on all the nodes... it has to be in order to get all the nodes up and running in the cluster. Also, I'm confused by what you mean with "That's most probably the class that implements the closure you're…

Re: reduceByKey issue in example wordcount (scala)

2014-04-14 Thread Marcelo Vanzin
Hi Ian, On Mon, Apr 14, 2014 at 11:30 AM, Ian Bonnycastle wrote: > val sc = new SparkContext("spark://:7077", > "Simple App", "/usr/local/pkg/spark", > List("target/scala-2.10/simple-project_2.10-1.0.jar")) Hmmm... does /usr/local/pkg/spark exist on…

Re: reduceByKey issue in example wordcount (scala)

2014-04-14 Thread Ian Bonnycastle
Hi Marcelo, thanks for answering. That didn't seem to help. I have the following now: val sc = new SparkContext("spark://:7077", "Simple App", "/usr/local/pkg/spark", List("target/scala-2.10/simple-project_2.10-1.0.jar")) sc.addJar("/home/spark/wor…

Re: reduceByKey issue in example wordcount (scala)

2014-04-14 Thread Marcelo Vanzin
Hi Ian, When you run your packaged application, are you adding its jar file to the SparkContext (by calling the addJar() method)? That will distribute the code to all the worker nodes. The failure you're seeing seems to indicate the worker nodes do not have access to your code. On Mon, Apr 14, 2…
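Putting Marcelo's suggestion together with the constructor snippet quoted later in the thread, the setup would look roughly like this. This is only a sketch for the Spark 0.9-era SparkContext API discussed here; the master hostname is a placeholder (it is blanked in the archive), and the jar path reuses the one from the constructor, since Ian's actual addJar() path is cut off.

```scala
import org.apache.spark.SparkContext

// Sketch of the standalone-cluster setup discussed in the thread.
// "sparkmaster" is a placeholder host; the real value is blanked
// out in the archived messages.
val appJar = "target/scala-2.10/simple-project_2.10-1.0.jar"

val sc = new SparkContext(
  "spark://sparkmaster:7077",  // standalone master URL
  "Simple App",                // application name
  "/usr/local/pkg/spark",      // Spark home, present on every node
  List(appJar))                // jars shipped to the workers

// addJar() also distributes a jar to the worker nodes, which is
// what Marcelo recommends when workers cannot find the closure
// classes from the application jar.
sc.addJar(appJar)
```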

reduceByKey issue in example wordcount (scala)

2014-04-14 Thread Ian Bonnycastle
Good afternoon, I'm attempting to get the wordcount example working, and I keep getting an error in the "reduceByKey(_ + _)" call. I've scoured the mailing lists, and haven't been able to find a surefire solution, unless I'm missing something big. I did find something close, but it didn't appear…
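For readers unfamiliar with the operation the error occurs in: reduceByKey(_ + _) merges all values that share a key by summing them. The sketch below reproduces that semantics with plain Scala collections (groupBy plus reduce), with no Spark cluster involved, so the error in the thread cannot come from the logic itself but from how the application code is packaged and shipped to the workers.

```scala
// Plain-Scala analogue of the Spark wordcount pipeline, to show
// what reduceByKey(_ + _) computes. No SparkContext needed.
object WordCountSketch {
  def wordCount(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(_.split("\\s+"))   // tokenize on whitespace
      .filter(_.nonEmpty)
      .map(w => (w, 1))           // same shape as map(word => (word, 1))
      .groupBy(_._1)              // local analogue of the shuffle by key
      .map { case (w, pairs) =>   // reduceByKey(_ + _): sum per key
        (w, pairs.map(_._2).reduce(_ + _))
      }

  def main(args: Array[String]): Unit =
    println(wordCount(Seq("to be or not to be")))
}
```

Running the main method prints a map with each word's count, e.g. "to" and "be" both map to 2 for the sample line above.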