Hi,
I have trouble running some custom code on Spark 0.9.1 in standalone
mode on a cluster. I built a fat jar (excluding Spark) that I'm adding
to the classpath with ADD_JARS=... When I start the Spark shell, I can
instantiate classes from the jar, but when I run Spark code that uses
them, I get strange ClassCastExceptions.
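
The failure pattern looks roughly like this (class and path names are
placeholders, not my real ones):

    // shell started via: ADD_JARS=/path/to/myapp-assembly.jar ./bin/spark-shell
    val r = new com.example.MyRecord(1)        // works: plain instantiation in the shell
    sc.parallelize(1 to 10)
      .map(i => new com.example.MyRecord(i))   // fails once Spark actually runs it
      .collect()                               // (MyRecord is Serializable)
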
Hi Sebastian,
That exception generally means the same class has been loaded by two
different class loaders, and some code is trying to mix instances
created from the two different copies of the class.
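
You can reproduce the effect outside Spark with two isolated class
loaders; a minimal sketch, with a placeholder jar path and class name:

    import java.net.URLClassLoader

    // parent = null: neither loader delegates to the application class
    // loader, so each one loads the class from the jar by itself
    val jarUrl  = new java.io.File("/path/to/myapp-assembly.jar").toURI.toURL
    val loaderA = new URLClassLoader(Array(jarUrl), null)
    val loaderB = new URLClassLoader(Array(jarUrl), null)

    val clsA = loaderA.loadClass("com.example.MyRecord")
    val clsB = loaderB.loadClass("com.example.MyRecord")

    println(clsA == clsB)    // false: same name, two distinct Class objects
    val inst = clsA.newInstance().asInstanceOf[AnyRef]  // assumes a no-arg constructor
    clsB.cast(inst)          // ClassCastException, even though both sides
                             // are named com.example.MyRecord
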
Do you happen to have that class both in the Spark jars and in your
app's uber-jar? That might explain the exceptions you're seeing.
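
One way to check from the shell is to ask where the class was actually
loaded from (class name is a placeholder):

    // prints the loader and the jar the class came from; getCodeSource
    // can be null for JDK bootstrap classes, but not for application jars
    val cls = Class.forName("com.example.MyRecord")
    println(cls.getClassLoader)
    println(cls.getProtectionDomain.getCodeSource.getLocation)

If the same class is also present in the Spark assembly jar on the
cluster, that's the duplicate to eliminate.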