[
https://issues.apache.org/jira/browse/SPARK-19552?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15923773#comment-15923773
]
Virgil Palanciuc commented on SPARK-19552:
------------------------------------------
If this is going to be released in Spark 2.1.1, please make sure you upgrade to
4.1.9 final.
I've hit an issue where Spark would simply take forever to run - initially I
suspected a skewed join, but after some more investigation I noticed it's
stuck in {{io.netty.util.Recycler$Stack.scavengeSome}}, which led me to this
bug: https://github.com/netty/netty/issues/6153
Apparently it's fixed in netty 4.0.43, but Spark 2.1.0 uses netty 4.0.42...
(the fix was cherry-picked in the netty 4.1 line, and is available since 4.1.9)
> Upgrade Netty version to 4.1.8 final
> ------------------------------------
>
> Key: SPARK-19552
> URL: https://issues.apache.org/jira/browse/SPARK-19552
> Project: Spark
> Issue Type: Improvement
> Components: Build
> Affects Versions: 2.1.0
> Reporter: Adam Roberts
> Priority: Minor
>
> Netty 4.1.8 was recently released but isn't API compatible with previous
> major versions (like Netty 4.0.x), see
> http://netty.io/news/2017/01/30/4-0-44-Final-4-1-8-Final.html for details.
> This version does include a fix for a security concern but not one we'd be
> exposed to with Spark "out of the box". Let's upgrade the version we use to
> be on the safe side as the security fix I'm especially interested in is not
> available in the 4.0.x release line.
> We should move up anyway to take on a bunch of other bug fixes cited in the
> release notes (and if anyone were to use Spark with netty and tcnative, they
> shouldn't be exposed to the security problem) - we should be good citizens
> and make this change.
> As this 4.1 version involves API changes, we'll need to implement a few
> methods and possibly adjust the Sasl tests. This JIRA and the associated pull
> request start the process, which I'll work on - any help would be much
> appreciated! Here's what I know so far:
> {code}
> @Override
> public void write(ChannelHandlerContext ctx, Object msg, ChannelPromise promise)
>     throws Exception {
>   if (!foundEncryptionHandler) {
>     // This lookup returns false and causes test failures:
>     foundEncryptionHandler =
>         ctx.channel().pipeline().get(encryptHandlerName) != null;
>   }
>   ctx.write(msg, promise);
> }
> {code}
> Here's what changes will be required (at least):
> {code}common/network-common/src/main/java/org/apache/spark/network/crypto/TransportCipher.java{code}
> requires touch, retain and transferred methods.
> {code}common/network-common/src/main/java/org/apache/spark/network/sasl/SaslEncryption.java{code}
> requires the above methods too.
> {code}common/network-common/src/test/java/org/apache/spark/network/protocol/MessageWithHeaderSuite.java{code}
> With "dummy" implementations so we can at least compile and test, we'll see
> five new test failures to address.
> These are
> {code}
> org.apache.spark.network.sasl.SparkSaslSuite.testFileRegionEncryption
> org.apache.spark.network.sasl.SparkSaslSuite.testSaslEncryption
> org.apache.spark.network.shuffle.ExternalShuffleSecuritySuite.testEncryption
> org.apache.spark.rpc.netty.NettyRpcEnvSuite.send with SASL encryption
> org.apache.spark.rpc.netty.NettyRpcEnvSuite.ask with SASL encryption
> {code}
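The "dummy" touch/retain/transferred implementations the issue calls for can be sketched roughly as below. This is only an illustration: Netty is not on the classpath here, so {{FileRegionLike}} and {{EncryptedMessageSketch}} are hypothetical stand-ins mirroring the relevant shape of Netty 4.1's {{FileRegion}} contract (which tightens {{ReferenceCounted}} and renames {{transfered()}} to {{transferred()}}); the real Spark classes implement {{io.netty.channel.FileRegion}} instead.

```java
// Hypothetical stand-in for the slice of Netty 4.1's FileRegion /
// ReferenceCounted contract that the issue says TransportCipher and
// SaslEncryption wrappers must now implement.
interface FileRegionLike {
  long transferred();                   // Netty 4.1 rename of transfered()
  FileRegionLike touch(Object hint);    // leak-tracking hook, often a no-op
  FileRegionLike retain(int increment); // bump the reference count
  int refCnt();
}

// Minimal "dummy" implementation, enough to compile and run the tests.
class EncryptedMessageSketch implements FileRegionLike {
  private int refCnt = 1;       // reference count starts at 1 on creation
  private long transferred = 0; // bytes written out so far

  @Override
  public long transferred() {
    return transferred;
  }

  @Override
  public EncryptedMessageSketch touch(Object hint) {
    // Dummy implementation: record nothing, just return this.
    return this;
  }

  @Override
  public EncryptedMessageSketch retain(int increment) {
    refCnt += increment;
    return this;
  }

  @Override
  public int refCnt() {
    return refCnt;
  }
}
```

In the actual port these would delegate to the wrapped region where one exists, rather than being pure no-ops.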
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]