Re: ADD_JARS doesn't properly work for spark-shell

2014-01-05 Thread Aureliano Buendia
On Sun, Jan 5, 2014 at 6:01 AM, Aaron Davidson ilike...@gmail.com wrote: That sounds like a different issue. What is the type of myrdd (i.e., if you just type myrdd into the shell)? It's possible it's defined as an RDD[Nothing] and thus all operations try to typecast to Nothing, which always …
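[The RDD[Nothing] pitfall Aaron describes comes from Scala inferring the bottom type Nothing for an unannotated empty collection. A minimal sketch of that inference (illustrative only; `sc` is the SparkContext the shell provides):

```scala
// An empty literal with no annotation infers element type Nothing:
val empty = Seq()            // empty: Seq[Nothing]
val typed = Seq.empty[Int]   // typed: Seq[Int]

// The Spark analogue is sc.parallelize(Seq()), which yields an
// RDD[Nothing]; downstream operations then attempt casts to Nothing,
// which cannot succeed for any runtime value. Annotating the element
// type, e.g. sc.parallelize(Seq.empty[Int]), avoids the trap.
```
]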

ADD_JARS doesn't properly work for spark-shell

2014-01-04 Thread Aureliano Buendia
Hi, I'm trying to access my standalone spark app from spark-shell. I tried starting the shell by: MASTER=local[2] ADD_JARS=/path/to/my/jar ./spark-shell The log shows that the jar file was loaded. Also, I can access and create a new instance of mypackage.MyClass. The problem is that …
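[A sketch of the launch command from the report, for Spark 0.8.x; the jar path is a placeholder:

```sh
# Spark 0.8.x REPL launch: MASTER selects the scheduler,
# ADD_JARS ships the listed jars to the executors.
MASTER=local[2] ADD_JARS=/path/to/my/jar ./spark-shell
```

Inside the shell, `new mypackage.MyClass()` succeeding (as the report says) confirms the class is at least visible to the REPL's classloader.]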

Re: ADD_JARS doesn't properly work for spark-shell

2014-01-04 Thread Imran Rashid
actually, I think adding it to SPARK_CLASSPATH is exactly right. The exception is not on the executors, but in the driver -- it's happening when the driver tries to read results that the executor is sending back to it. So the executors know about mypackage.MyClass, they happily run and send their …
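[In other words: ADD_JARS ships the jar to executors, but the driver JVM also needs it on its own classpath to deserialize the objects coming back. A hedged sketch of the failure mode, reusing the thread's hypothetical mypackage.MyClass:

```scala
// Executors have the jar (shipped via ADD_JARS), so the map succeeds:
val rdd = sc.parallelize(1 to 10).map(_ => new mypackage.MyClass())

// But an action that returns MyClass instances, e.g. rdd.collect(),
// deserializes them in the driver JVM; without the jar on the
// driver's classpath (SPARK_CLASSPATH in 0.8.x), that step fails
// with a ClassNotFoundException.
```
]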

Re: ADD_JARS doesn't properly work for spark-shell

2014-01-04 Thread Aureliano Buendia
On Sun, Jan 5, 2014 at 2:28 AM, Aaron Davidson ilike...@gmail.com wrote: Additionally, which version of Spark are you running? 0.8.1. Unfortunately, this doesn't work either: MASTER=local[2] ADD_JARS=/path/to/my/jar SPARK_CLASSPATH=/path/to/my/jar ./spark-shell On Sat, Jan 4, 2014 at …
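[For readability, the combined invocation being attempted, spread over lines; paths are placeholders:

```sh
# Spark 0.8.1: same jar on both the executor side (ADD_JARS)
# and the driver's classpath (SPARK_CLASSPATH).
MASTER=local[2] \
  ADD_JARS=/path/to/my/jar \
  SPARK_CLASSPATH=/path/to/my/jar \
  ./spark-shell
```
]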

Re: ADD_JARS doesn't properly work for spark-shell

2014-01-04 Thread Aaron Davidson
Cool. To confirm, you said you can access the class and construct new objects -- did you do this in the shell itself (i.e., on the driver), or on the executors? Specifically, one of the following two should fail in the shell: new mypackage.MyClass() sc.parallelize(0 until 10, 2).foreach(_ => new mypackage.MyClass())
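[Aaron's two checks distinguish a driver-side from an executor-side classloading problem. Laid out as a shell session sketch (mypackage.MyClass is the thread's hypothetical class; `sc` is provided by spark-shell):

```scala
// (1) Driver-side check: the object is constructed in the
//     REPL's own JVM, so this tests the driver's classpath.
new mypackage.MyClass()

// (2) Executor-side check: the closure is serialized and run on
//     the executors, so this tests the executors' classpath.
sc.parallelize(0 until 10, 2).foreach(_ => new mypackage.MyClass())
```

Whichever of the two throws localizes which JVM is missing the jar.]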

Re: ADD_JARS doesn't properly work for spark-shell

2014-01-04 Thread Aureliano Buendia
Sorry, I had a typo. I can confirm that using ADD_JARS together with SPARK_CLASSPATH works as expected in spark-shell. It'd make sense to have the two combined as one option. On Sun, Jan 5, 2014 at 3:51 AM, Aaron Davidson ilike...@gmail.com wrote: Cool. To confirm, you said you can access the …