Hi,
I'm having trouble running some custom code on Spark 0.9.1 in
standalone mode on a cluster. I built a fat jar (excluding the Spark
dependencies) that I'm adding to the classpath with ADD_JARS=...
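For reference, I launch the shell roughly like this (the jar path and
master URL below are placeholders for my actual setup):

  ADD_JARS=/path/to/my-assembly.jar \
  MASTER=spark://master:7077 \
  ./bin/spark-shell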
When I start the Spark shell, I can instantiate the classes from the
jar just fine, but as soon as I run Spark code that uses them, I get
strange ClassCastExceptions like this:
14/05/29 14:48:10 INFO TaskSetManager: Loss was due to
java.lang.ClassCastException: io.ssc.sampling.matrix.DenseBlock cannot
be cast to io.ssc.sampling.matrix.DenseBlock [duplicate 1]
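For illustration, the kind of thing that fails is a simple job along
these lines (just a sketch; the no-arg DenseBlock constructor here is
made up, the real code is more involved):

  import io.ssc.sampling.matrix.DenseBlock

  // trivial job that makes the executors load and instantiate DenseBlock
  val blocks = sc.parallelize(1 to 10).map(_ => new DenseBlock())
  blocks.count()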
Note that the exception claims that DenseBlock cannot be cast to
DenseBlock, i.e. to the very same class, which makes me suspect that
the class is being loaded by two different classloaders. What am I
doing wrong?
Thx,
Sebastian