On Sun, Jan 5, 2014 at 6:01 AM, Aaron Davidson ilike...@gmail.com wrote:
That sounds like a different issue. What is the type of myrdd (i.e., if
you just type myrdd into the shell)? It's possible it's defined as an
RDD[Nothing] and thus all operations try to typecast to Nothing, which
always fails.
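For context, a minimal way to end up with an RDD[Nothing] in the shell (this sketch is illustrative, not from the original report; the objectFile call and path are assumptions):

scala> val myrdd = sc.objectFile("/tmp/data")   // no type parameter, so Scala infers Nothing
myrdd: org.apache.spark.rdd.RDD[Nothing] = ...
scala> myrdd.first()   // ClassCastException: no value can be cast to Nothing
scala> val ok = sc.objectFile[mypackage.MyClass]("/tmp/data")   // an explicit type parameter avoids this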
Hi,
I'm trying to access my standalone Spark app from spark-shell. I tried
starting the shell by:
MASTER=local[2] ADD_JARS=/path/to/my/jar ./spark-shell
The log shows that the jar file was loaded. Also, I can access and create a
new instance of mypackage.MyClass.
The problem is that ...
Actually, I think adding it to SPARK_CLASSPATH is exactly right. The
exception is not on the executors, but in the driver -- it's happening when
the driver tries to read results that the executors are sending back to it.
So the executors know about mypackage.MyClass; they happily run and send
their results back, and the failure occurs when the driver deserializes them.
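As an illustration of that distinction (a sketch, not from the thread; it assumes MyClass is serializable):

scala> val rdd = sc.parallelize(1 to 10).map(_ => new mypackage.MyClass())
scala> rdd.count()    // fine: only the executors construct MyClass
scala> rdd.collect()  // the executors send MyClass instances back; the driver
                      // must deserialize them, and that is where the exception appears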
On Sun, Jan 5, 2014 at 2:28 AM, Aaron Davidson ilike...@gmail.com wrote:
Additionally, which version of Spark are you running?
0.8.1.
Unfortunately, this doesn't work either:
MASTER=local[2] ADD_JARS=/path/to/my/jar SPARK_CLASSPATH=/path/to/my/jar./spark-shell
Cool. To confirm, you said you can access the class and construct new
objects -- did you do this in the shell itself (i.e., on the driver), or on
the executors?
Specifically, one of the following two should fail in the shell:
new mypackage.MyClass()
sc.parallelize(0 until 10, 2).foreach(_ => new mypackage.MyClass())
Sorry, I had a typo. I can confirm that using ADD_JARS together with
SPARK_CLASSPATH works as expected in spark-shell.
It'd make sense to have the two combined as one option.
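For the record, the working invocation with the space restored before ./spark-shell:

MASTER=local[2] ADD_JARS=/path/to/my/jar SPARK_CLASSPATH=/path/to/my/jar ./spark-shell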