[ https://issues.apache.org/jira/browse/SPARK-19552?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16134657#comment-16134657 ]

Andrew Ash commented on SPARK-19552:
------------------------------------

Heads up for the next time someone attempts this:

Upgrading to 4.1.x causes a few Apache Arrow-related integration test failures 
in Spark, because Arrow depends on a part of Netty 4.0.x that changed in the 
4.1.x series. I've been running with Netty 4.1.x on my fork for a few months, 
but recent Arrow changes mean I now have to downgrade back to Netty 4.0.x 
because of that Arrow dependency. More details at 
https://github.com/palantir/spark/pull/247#issuecomment-323469174

So when Spark does go to 4.1.x we might need Arrow to bump its dependencies as 
well (or shade Netty in Spark at the same time, as Sean suggests, which I 
generally support).

> Upgrade Netty version to 4.1.8 final
> ------------------------------------
>
>                 Key: SPARK-19552
>                 URL: https://issues.apache.org/jira/browse/SPARK-19552
>             Project: Spark
>          Issue Type: Improvement
>          Components: Build
>    Affects Versions: 2.1.0
>            Reporter: Adam Roberts
>            Priority: Minor
>
> Netty 4.1.8 was recently released but isn't API compatible with previous 
> major versions (like Netty 4.0.x); see 
> http://netty.io/news/2017/01/30/4-0-44-Final-4-1-8-Final.html for details.
> This version includes a fix for a security concern, though not one we'd be 
> exposed to with Spark "out of the box". Let's upgrade the version we use to 
> be on the safe side, as the security fix I'm especially interested in is not 
> available in the 4.0.x release line.
> We should move up anyway to take on a bunch of other bug fixes cited in the 
> release notes (and if anyone were to use Spark with Netty and tcnative, they 
> shouldn't be exposed to the security problem) - we should be good citizens 
> and make this change.
> As this 4.1 version involves API changes, we'll need to implement a few 
> methods and possibly adjust the Sasl tests. This JIRA and its associated pull 
> request start the process, which I'll work on - and any help would be much 
> appreciated! Currently I know:
> {code}
> @Override
> public void write(ChannelHandlerContext ctx, Object msg, ChannelPromise promise)
>     throws Exception {
>   if (!foundEncryptionHandler) {
>     foundEncryptionHandler =
>       ctx.channel().pipeline().get(encryptHandlerName) != null; // <-- returns false and causes test failures
>   }
>   ctx.write(msg, promise);
> }
> {code}
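> One way to narrow that down (a hypothetical debugging aid, not part of the 
> PR - the helper below and its name are made up) is to log every handler 
> actually registered in the pipeline and compare against encryptHandlerName:
> {code}
> import io.netty.channel.ChannelHandlerContext;
>
> final class PipelineDebug {
>   // Hypothetical helper: print each handler name registered in the channel's
>   // pipeline, to check whether the encryption handler was added under a
>   // different name than the one write() looks it up by.
>   static void dumpHandlers(ChannelHandlerContext ctx) {
>     for (String name : ctx.channel().pipeline().names()) {
>       System.out.println("pipeline handler: " + name);
>     }
>   }
> }
> {code}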
> Here's what changes will be required (at least):
> {code}
> common/network-common/src/main/java/org/apache/spark/network/crypto/TransportCipher.java
> {code}
> requires touch, retain and transferred methods (sketched below)
> {code}
> common/network-common/src/main/java/org/apache/spark/network/sasl/SaslEncryption.java
> {code}
> requires the above methods too
> {code}
> common/network-common/src/test/java/org/apache/spark/network/protocol/MessageWithHeaderSuite.java
> {code}
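> As a rough sketch of the shape those methods take (assuming we centralize 
> them in a shared base class - the class name here is illustrative, not final 
> code), the overrides Netty 4.1 wants look something like:
> {code}
> import io.netty.channel.FileRegion;
> import io.netty.util.AbstractReferenceCounted;
>
> // Illustrative base class for our FileRegion implementations; concrete
> // subclasses still implement position(), count(), transferred(),
> // transferTo() and deallocate().
> abstract class AbstractEncryptedRegion extends AbstractReferenceCounted
>     implements FileRegion {
>
>   @Override
>   @SuppressWarnings("deprecation")
>   public final long transfered() {
>     // 4.1 deprecates the misspelled 4.0 name in favour of transferred()
>     return transferred();
>   }
>
>   // FileRegion in 4.1 narrows retain/touch to return FileRegion, so
>   // covariant overrides are needed
>   @Override
>   public AbstractEncryptedRegion touch() {
>     super.touch();
>     return this;
>   }
>
>   @Override
>   public AbstractEncryptedRegion touch(Object hint) {
>     return this;
>   }
>
>   @Override
>   public AbstractEncryptedRegion retain() {
>     super.retain();
>     return this;
>   }
>
>   @Override
>   public AbstractEncryptedRegion retain(int increment) {
>     super.retain(increment);
>     return this;
>   }
> }
> {code}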
> With "dummy" implementations so we can at least compile and test, we'll see 
> five new test failures to address.
> These are
> {code}
> org.apache.spark.network.sasl.SparkSaslSuite.testFileRegionEncryption
> org.apache.spark.network.sasl.SparkSaslSuite.testSaslEncryption
> org.apache.spark.network.shuffle.ExternalShuffleSecuritySuite.testEncryption
> org.apache.spark.rpc.netty.NettyRpcEnvSuite.send with SASL encryption
> org.apache.spark.rpc.netty.NettyRpcEnvSuite.ask with SASL encryption
> {code}


