[ https://issues.apache.org/jira/browse/SPARK-4738?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15195671#comment-15195671 ]

Stan commented on SPARK-4738:
-----------------------------

For what it's worth, I have found a workaround: put a later version of the 
spark-assembly jar directly in /opt/ and pass the following flag to spark-submit:

{{--conf 'spark.yarn.jar=local:/opt/cloudera/parcels/CDH/jars/spark-assembly-1.5.2-hadoop2.2.0.jar'}}
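As a minimal sketch of how this looks in a full spark-submit invocation (the application class and jar names are placeholders, not from this ticket), assuming the newer assembly has been copied to that same local path on every node:

{code}
# Point spark.yarn.jar at the replacement assembly so YARN containers use it
# instead of the stock one; the local: scheme means the file is read from each
# node's local filesystem rather than uploaded from the client.
spark-submit \
  --master yarn-cluster \
  --conf 'spark.yarn.jar=local:/opt/cloudera/parcels/CDH/jars/spark-assembly-1.5.2-hadoop2.2.0.jar' \
  --class com.example.MyApp \
  my-app.jar
{code}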

> Update the netty-3.x version in spark-assembly-*.jar
> ----------------------------------------------------
>
>                 Key: SPARK-4738
>                 URL: https://issues.apache.org/jira/browse/SPARK-4738
>             Project: Spark
>          Issue Type: Improvement
>          Components: Deploy
>    Affects Versions: 1.1.0
>            Reporter: Tobias Pfeiffer
>            Priority: Minor
>
> It seems as if the version of akka-remote (2.2.3-shaded-protobuf) that is 
> bundled in the spark-assembly-1.1.1-hadoop2.4.0.jar file pulls in an ancient 
> version of netty, namely io.netty:netty:3.6.6.Final (using the package 
> org.jboss.netty). This means that when using spark-submit, this netty version 
> will always be on the classpath before any versions added by the user. This 
> may lead to issues with other packages that depend on newer versions, which 
> may then fail with java.lang.NoSuchMethodError etc. (finagle-http in my case).
> I wonder if it is possible to manually include a newer netty version, such as 
> netty-3.8.0.Final.
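For anyone hitting the same conflict, a hedged sketch of how to confirm which netty 3.x version a given assembly actually bundles (this assumes the shaded jar retains its Maven metadata, which is not guaranteed):

{code}
# Check that the assembly really contains the org.jboss.netty (netty 3.x) classes
unzip -l spark-assembly-1.1.1-hadoop2.4.0.jar | grep -c 'org/jboss/netty/'

# If the shaded Maven metadata is present, read the bundled netty version directly
unzip -p spark-assembly-1.1.1-hadoop2.4.0.jar \
  META-INF/maven/io.netty/netty/pom.properties 2>/dev/null | grep '^version'
{code}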


