I figured out that Logging is a DeveloperApi and it should not be used outside
Spark code, so everything is fine now. Thanks again, Marcelo.
> On 24 Mar 2015, at 20:06, Marcelo Vanzin wrote:
>
> From the exception it seems like your app is also repackaging Scala
> classes somehow. Can you double check that and remove the Scala
> classes from your app if they're there?
From the exception it seems like your app is also repackaging Scala
classes somehow. Can you double check that and remove the Scala
classes from your app if they're there?
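One quick way to check for repackaged Scala classes is to scan the jar's entry list. A self-contained sketch follows: it builds a stand-in `app.jar` (a hypothetical name) that deliberately bundles a Scala runtime class, then scans it; in practice you would run only the scan, against your real assembly jar.

```shell
# Create a stand-in app jar that wrongly bundles a Scala runtime class.
python3 - <<'EOF'
import zipfile
with zipfile.ZipFile('app.jar', 'w') as z:
    z.writestr('com/example/Main.class', b'')
    z.writestr('scala/Function1.class', b'')  # the offending entry
EOF
# Any output here means the jar repackages Scala classes and they
# should be excluded from the assembly (e.g. scope scala-library
# as "provided" so it is not packaged):
python3 -c "import zipfile; print('\n'.join(n for n in zipfile.ZipFile('app.jar').namelist() if n.startswith('scala/')))"
```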
On Mon, Mar 23, 2015 at 10:07 PM, Alexey Zinoviev
wrote:
> Thanks Marcelo, these options solved the problem (I'm using 1.3.0)
Thanks Marcelo, these options solved the problem (I'm using 1.3.0), but it
works only if I remove "extends Logging" from the object; with "extends
Logging" it returns:
Exception in thread "main" java.lang.LinkageError: loader constraint
violation in interface itable initialization: when resolving method …
Thanks Ted, I'll try; hope there are no transitive dependencies on 3.2.10.
On Tue, Mar 24, 2015 at 4:21 AM, Ted Yu wrote:
> Looking at core/pom.xml :
>
> <groupId>org.json4s</groupId>
> <artifactId>json4s-jackson_${scala.binary.version}</artifactId>
> <version>3.2.10</version>
>
>
> The version is hard coded.
>
> You can rebuild Spark 1.3.0 with json4s 3.2.11
You could build a fat jar for your application containing both your
code and the json4s library, and then run Spark with these two
options:
spark.driver.userClassPathFirst=true
spark.executor.userClassPathFirst=true
Both only work in 1.3. (1.2 has spark.files.userClassPathFirst, but
that only applies to executors.)
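On the spark-submit command line, these properties would be passed as `--conf` flags. A sketch, with the jar name and main class as placeholders (only the two `--conf` flags come from the thread):

```shell
# Placeholder jar and main class; the two userClassPathFirst flags
# make Spark prefer classes from the application jar (Spark 1.3+).
spark-submit \
  --class com.example.Main \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  sparkapp-assembly-0.1.jar
```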
Looking at core/pom.xml :
<groupId>org.json4s</groupId>
<artifactId>json4s-jackson_${scala.binary.version}</artifactId>
<version>3.2.10</version>
The version is hard coded.
You can rebuild Spark 1.3.0 with json4s 3.2.11
Cheers
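Since the version is hard coded, rebuilding means editing the pom first. A sketch of that edit, run here against a scratch copy of the dependency block so it is self-contained; point the `sed` at a real Spark 1.3.0 checkout in practice, then rebuild.

```shell
# Scratch copy of the hard-coded dependency block from core/pom.xml.
mkdir -p spark/core
cat > spark/core/pom.xml <<'EOF'
<dependency>
  <groupId>org.json4s</groupId>
  <artifactId>json4s-jackson_${scala.binary.version}</artifactId>
  <version>3.2.10</version>
</dependency>
EOF
# Bump the hard-coded json4s version in place.
sed -i 's|<version>3.2.10</version>|<version>3.2.11</version>|' spark/core/pom.xml
grep '<version>' spark/core/pom.xml
# After editing a real checkout, rebuild with e.g.:
#   mvn -DskipTests clean package
```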
On Mon, Mar 23, 2015 at 2:12 PM, Alexey Zinoviev
wrote:
> Spark has a dependency on json4s 3.2.10, but this version has several bugs
Spark has a dependency on json4s 3.2.10, but this version has several bugs
and I need to use 3.2.11. I added json4s-native 3.2.11 dependency to
build.sbt and everything compiled fine. But when I spark-submit my JAR, it
still picks up 3.2.10.
build.sbt
import sbt.Keys._
name := "sparkapp"
vers
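The build.sbt above is cut off. A minimal sketch of what such a build might look like; the versions, the json4s module, and the "provided" scoping are assumptions for illustration, not taken from the original message:

```scala
// build.sbt — minimal sketch; versions are assumptions
name := "sparkapp"

version := "0.1"

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  // "provided" keeps Spark (and its bundled Scala) out of the fat jar
  "org.apache.spark" %% "spark-core" % "1.3.0" % "provided",
  // the newer json4s the application actually needs
  "org.json4s" %% "json4s-native" % "3.2.11"
)
```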