[
https://issues.apache.org/jira/browse/SPARK-12807?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15096826#comment-15096826
]
Steve Loughran commented on SPARK-12807:
----------------------------------------
Jackson versioning is really a symptom of a greater problem: lack of classpath
isolation in YARN aux services.
Fixing CP isolation there (YARN-1573) is the best option for Hadoop 2.9+; forked
JVMs would be even better, as you also get failure isolation.
Short term (support for Hadoop <= 2.8), I don't know.
I'm now confused about what's happening here, as in "why hasn't this problem
surfaced before?"
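Until classpath isolation lands, the usual mitigation for this class of conflict is to shade and relocate Jackson inside the jar deployed as the aux service, so the NodeManager's own Jackson 2.2.3 and the shuffle service's copy can never collide. A sketch of a Maven Shade plugin relocation (the relocated package prefix `org.sparkproject.jackson` is illustrative, not Spark's actual choice):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <relocations>
      <!-- Rewrite Jackson's packages in the shuffle-service jar so its
           classes load under a private name and cannot clash with the
           Jackson version already on the NodeManager classpath. -->
      <relocation>
        <pattern>com.fasterxml.jackson</pattern>
        <shadedPattern>org.sparkproject.jackson</shadedPattern>
      </relocation>
    </relocations>
  </configuration>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

Shading sidesteps the version mismatch without any NM-side changes, at the cost of a larger jar and losing the ability to patch Jackson independently of the aux-service jar.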
> Spark External Shuffle not working in Hadoop clusters with Jackson 2.2.3
> ------------------------------------------------------------------------
>
> Key: SPARK-12807
> URL: https://issues.apache.org/jira/browse/SPARK-12807
> Project: Spark
> Issue Type: Bug
> Components: Shuffle, YARN
> Affects Versions: 1.6.0
> Environment: A Hadoop cluster with Jackson 2.2.3, spark running with
> dynamic allocation enabled
> Reporter: Steve Loughran
>
> When you try to use dynamic allocation on a Hadoop 2.6-based cluster, you
> see a stack trace in the NM logs indicating a Jackson 2.x version
> mismatch.
> (reported on the spark dev list)