[ https://issues.apache.org/jira/browse/SPARK-12807?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15102920#comment-15102920 ]
Steve Loughran commented on SPARK-12807:
----------------------------------------
One thing to think about here is ramping up a notch and shading all the
downstream dependencies in the YARN shuffle JAR.
This is a JAR designed to be used in one specific place: the NodeManager
classpath. It now includes netty, leveldb, some bits of com.google (in 1.6),
and some javax.annotation classes.
What it also has, for extra fun, is a leveldb JNI .so under native/, as well as
a netty one. This is going to be a problem: unless you can somehow isolate and
shade that, this shuffle JAR is going to force a specific leveldb version onto
every bit of code picking up this JAR.
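As a rough sketch of what the extra shading could look like, assuming the
shuffle JAR is assembled with the maven-shade-plugin (the relocation patterns
and the org.spark_project prefix below are illustrative, not the actual Spark
build config):
{code:xml}
<!-- Hypothetical sketch: relocate the shuffle JAR's bundled classes so they
     cannot clash with whatever the NodeManager already has on its classpath.
     Package prefixes and patterns are illustrative only. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <relocations>
      <relocation>
        <pattern>io.netty</pattern>
        <shadedPattern>org.spark_project.io.netty</shadedPattern>
      </relocation>
      <relocation>
        <pattern>org.iq80.leveldb</pattern>
        <shadedPattern>org.spark_project.leveldb</shadedPattern>
      </relocation>
      <relocation>
        <pattern>com.google.common</pattern>
        <shadedPattern>org.spark_project.guava</shadedPattern>
      </relocation>
      <relocation>
        <pattern>com.fasterxml.jackson</pattern>
        <shadedPattern>org.spark_project.jackson</shadedPattern>
      </relocation>
    </relocations>
  </configuration>
</plugin>
{code}
Note that relocation only rewrites class references; the bundled leveldb and
netty .so files, and the JNI loading path that finds them, are untouched, which
is exactly the isolation problem described above.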
> Spark External Shuffle not working in Hadoop clusters with Jackson 2.2.3
> ------------------------------------------------------------------------
>
> Key: SPARK-12807
> URL: https://issues.apache.org/jira/browse/SPARK-12807
> Project: Spark
> Issue Type: Bug
> Components: Shuffle, YARN
> Affects Versions: 1.6.0
> Environment: A Hadoop cluster with Jackson 2.2.3, spark running with
> dynamic allocation enabled
> Reporter: Steve Loughran
> Priority: Critical
>
> When you try to use dynamic allocation on a Hadoop 2.6-based cluster, you get
> to see a stack trace in the NM logs indicating a Jackson 2.x version mismatch.
> (reported on the spark dev list)