I frequently run into problems building Spark as a dependency in Java
projects because of version conflicts with other dependencies. Usually two
different versions of a library end up on the classpath, and we see an
AbstractMethodError, an invalid method signature, or something similar.

So far, I've seen it happen with jackson, slf4j, netty, jetty, javax.servlet
and maybe a few others. Even the spark-jobserver project excludes the netty
dependencies when including Spark. These errors are always a pain to debug
and work around - in one particularly nasty conflict between jackson 1.x in
Spark and our codebase, we ended up just migrating to jackson 2.
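
For context, the usual workaround looks something like the following - exclude the conflicting transitive dependency from the Spark artifact and pin the version your own code needs. (This is a sketch; the artifact ids and versions here are illustrative, not a recommendation.)

```xml
<!-- Illustrative: exclude Spark's netty and depend on our own version. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.3.0</version>
  <exclusions>
    <exclusion>
      <groupId>io.netty</groupId>
      <artifactId>netty-all</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

This works, but it has to be rediscovered per-library and per-project, which is why shading on the Spark side seems preferable.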

I see shading is already set up for jetty and guava in the Spark POM, so
perhaps shading the others wouldn't be too difficult. Is there any reason
all of these dependencies aren't shaded in the Spark POM? It seems like a
good idea to me.
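
Concretely, I'm imagining extending the existing shade setup with more relocations, along these lines. (A sketch only - the shaded package name below is made up for illustration, and I haven't checked how Spark's actual shade execution is wired up beyond seeing the guava/jetty relocations.)

```xml
<!-- Illustrative: relocate jackson the same way the POM already
     relocates guava and jetty, so user classpaths can't conflict. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>com.fasterxml.jackson</pattern>
            <shadedPattern>org.spark-project.jackson</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```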



--
View this message in context: 
http://apache-spark-developers-list.1001551.n3.nabble.com/Why-are-all-spark-deps-not-shaded-to-avoid-dependency-hell-tp13091.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.
