[ https://issues.apache.org/jira/browse/SPARK-21495?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Xin Yu Pan updated SPARK-21495:
-------------------------------
    Description: 
We hit an issue when enabling authentication and SASL encryption; the relevant settings are the ones in bold in the parameter list below (a SparkConf equivalent of those settings is sketched after the list).
spark.local.dir /tmp/xpan-spark-161
spark.eventLog.dir file:///home/xpan/spark-conf/event
spark.eventLog.enabled true
spark.history.fs.logDirectory file:/home/xpan/spark-conf/event
spark.history.ui.port 18085
spark.history.fs.cleaner.enabled true
spark.history.fs.cleaner.interval 1d
spark.history.fs.cleaner.maxAge 14d
spark.dynamicAllocation.enabled false
spark.shuffle.service.enabled false
spark.shuffle.service.port 7448
spark.shuffle.reduceLocality.enabled false
spark.master.port 7087
spark.master.rest.port 6077
spark.executor.extraJavaOptions -Djava.security.egd=file:/dev/./urandom
*spark.authenticate true
spark.authenticate.secret 5828d44b-f9b9-4033-b1f5-21d1e3273ec2
spark.authenticate.enableSaslEncryption false
spark.network.sasl.serverAlwaysEncrypt false*
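
For reference, here are the bolded entries expressed as SparkConf calls. This is a sketch for clarity only; in our runs the values come from spark-defaults.conf, and the property names and values are exactly those listed above.

{code:scala}
import org.apache.spark.SparkConf

// Programmatic equivalent of the bolded spark-defaults.conf entries above.
// Sketch only; the actual runs rely on spark-defaults.conf, not on code.
val securityConf = new SparkConf()
  .set("spark.authenticate", "true")
  .set("spark.authenticate.secret", "5828d44b-f9b9-4033-b1f5-21d1e3273ec2")
  .set("spark.authenticate.enableSaslEncryption", "false")
  .set("spark.network.sasl.serverAlwaysEncrypt", "false")
{code}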


We run the simple SparkPi example; the application completes successfully, but the external shuffle service log below contains exceptions.
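
The job is just the bundled SparkPi example (org.apache.spark.examples.SparkPi, normally launched through spark-submit). For anyone reproducing without the examples jar, a rough Scala stand-in for that driver is sketched below; the object and app names are placeholders, and the security settings are expected to come from spark-defaults.conf as listed above.

{code:scala}
import scala.math.random
import org.apache.spark.{SparkConf, SparkContext}

// Rough stand-in for the bundled SparkPi example (placeholder names);
// the security settings come from spark-defaults.conf as listed above.
object SparkPiRepro {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("SparkPiRepro"))
    val n = 100000 * 100
    val count = sc.parallelize(1 to n, 100).map { _ =>
      val x = random * 2 - 1
      val y = random * 2 - 1
      if (x * x + y * y <= 1) 1 else 0
    }.reduce(_ + _)
    println(s"Pi is roughly ${4.0 * count / n}")
    sc.stop()
  }
}
{code}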

# cat spark-1.6.1-bin-hadoop2.6/logs/spark-xpan-org.apache.spark.deploy.ExternalShuffleService-1-cws-75.out
Spark Command: /opt/xpan/cws22-0713/jre/8.0.3.21/linux-x86_64/bin/java -cp /opt/xpan/spark-1.6.1-bin-hadoop2.6/conf/:/opt/xpan/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar:/opt/xpan/spark-1.6.1-bin-hadoop2.6/lib/datanucleus-api-jdo-3.2.6.jar:/opt/xpan/spark-1.6.1-bin-hadoop2.6/lib/datanucleus-core-3.2.10.jar:/opt/xpan/spark-1.6.1-bin-hadoop2.6/lib/datanucleus-rdbms-3.2.9.jar:/opt/xpan/hadoop-2.8.0/etc/hadoop/ -Xms2g -Xmx2g org.apache.spark.deploy.ExternalShuffleService
========================================
17/07/20 06:24:10 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/07/20 06:24:10 INFO spark.SecurityManager: Changing view acls to: xpan
17/07/20 06:24:10 INFO spark.SecurityManager: Changing modify acls to: xpan
17/07/20 06:24:10 INFO spark.SecurityManager: SecurityManager: authentication enabled; ui acls disabled; users with view permissions: Set(xpan); users with modify permissions: Set(xpan)
17/07/20 06:24:11 INFO deploy.ExternalShuffleService: Starting shuffle service on port 7448 with useSasl = true

# cat spark-1.6.1-bin-hadoop2.6/logs/spark-xpan-org.apache.spark.deploy.ExternalShuffleService-1-cws-75.out.1
... ...
17/07/20 02:57:31 INFO deploy.ExternalShuffleService: Starting shuffle service on port 7448 with useSasl = true
17/07/20 02:58:04 INFO shuffle.ExternalShuffleBlockResolver: Registered executor AppExecId{appId=app-20170720025800-0000, execId=0} with ExecutorShuffleInfo{localDirs=[/tmp/xpan-spark-161/spark-8e4885a3-c463-4dfb-a396-04e16b65fd1e/executor-be15fcd0-c946-4c83-ba25-3b20bbce5b0e/blockmgr-0fd2658a-ce15-4d56-901c-4c746161bbe0], subDirsPerLocalDir=64, shuffleManager=org.apache.spark.shuffle.sort.SortShuffleManager}
17/07/20 02:58:11 INFO security.sasl: DIGEST41:Unmatched MACs
17/07/20 02:58:11 WARN server.TransportChannelHandler: Exception in connection from /172.29.10.77:50616
io.netty.handler.codec.DecoderException: javax.security.sasl.SaslException: DIGEST-MD5: Out of order sequencing of messages from server. Got: 125 Expected: 123
        at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:99)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
        at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:86)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
        at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
        at java.lang.Thread.run(Thread.java:785)
Caused by: javax.security.sasl.SaslException: DIGEST-MD5: Out of order sequencing of messages from server. Got: 125 Expected: 123
        at com.ibm.security.sasl.digest.DigestMD5Base$DigestPrivacy.unwrap(DigestMD5Base.java:1535)
        at com.ibm.security.sasl.digest.DigestMD5Base.unwrap(DigestMD5Base.java:231)
        at org.apache.spark.network.sasl.SparkSaslServer.unwrap(SparkSaslServer.java:149)
        at org.apache.spark.network.sasl.SaslEncryption$DecryptionHandler.decode(SaslEncryption.java:127)
        at org.apache.spark.network.sasl.SaslEncryption$DecryptionHandler.decode(SaslEncryption.java:102)
        at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:89)
        ... 13 more
17/07/20 02:58:11 ERROR server.TransportRequestHandler: Error sending result ChunkFetchSuccess{streamChunkId=StreamChunkId{streamId=908084716000, chunkIndex=1}, buffer=FileSegmentManagedBuffer{file=/tmp/xpan-spark-161/spark-8e4885a3-c463-4dfb-a396-04e16b65fd1e/executor-be15fcd0-c946-4c83-ba25-3b20bbce5b0e/blockmgr-0fd2658a-ce15-4d56-901c-4c746161bbe0/0c/shuffle_0_17_0.data, offset=1893612, length=302981}} to /172.29.10.77:50616; closing connection
java.nio.channels.ClosedChannelException
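
For context on the exception itself: the frames DigestMD5Base$DigestPrivacy.unwrap and SparkSaslServer.unwrap show the failure is in the SASL DIGEST-MD5 privacy layer, which tags every wrapped buffer with a sequence number, so unwrap() must see the buffers in exactly the order the peer wrapped them. Below is a minimal, Spark-free sketch of that constraint using only the standard javax.security.sasl API; the user name, password, protocol and realm strings are made-up placeholders, and what exactly happens on the out-of-order unwrap (a SaslException like the one above vs. an "Unmatched MACs" drop) depends on the JDK's DIGEST-MD5 provider.

{code:scala}
import java.nio.charset.StandardCharsets.UTF_8
import javax.security.auth.callback.{Callback, CallbackHandler, NameCallback, PasswordCallback}
import javax.security.sasl.{AuthorizeCallback, RealmCallback, Sasl, SaslException}

// Demonstrates that DIGEST-MD5 wrapped buffers carry a sequence number and
// therefore must be unwrapped in order. Placeholder names/values throughout.
object SaslOrderDemo {
  private val user = "demoUser"
  private val secret = "demoSecret".toCharArray

  // Client: supply user name and password, accept the realm offered by the server.
  private val clientHandler = new CallbackHandler {
    override def handle(callbacks: Array[Callback]): Unit = callbacks.foreach {
      case nc: NameCallback     => nc.setName(user)
      case pc: PasswordCallback => pc.setPassword(secret)
      case rc: RealmCallback    => rc.setText(rc.getDefaultText)
      case _                    => // ignore anything else
    }
  }

  // Server: supply the shared secret for the presented user and authorize it.
  private val serverHandler = new CallbackHandler {
    override def handle(callbacks: Array[Callback]): Unit = callbacks.foreach {
      case pc: PasswordCallback  => pc.setPassword(secret)
      case rc: RealmCallback     => rc.setText(rc.getDefaultText)
      case ac: AuthorizeCallback => ac.setAuthorized(true)
      case _                     => // NameCallback default is fine
    }
  }

  def main(args: Array[String]): Unit = {
    // "auth-conf" requests the confidentiality (encryption) layer, the QOP that
    // makes wrap()/unwrap() encrypt and MAC each buffer.
    val props  = java.util.Collections.singletonMap(Sasl.QOP, "auth-conf")
    val server = Sasl.createSaslServer("DIGEST-MD5", "demo", "default", props, serverHandler)
    val client = Sasl.createSaslClient(Array("DIGEST-MD5"), null, "demo", "default", props, clientHandler)

    // DIGEST-MD5 handshake: initial challenge -> digest-response -> rspauth.
    var token = server.evaluateResponse(new Array[Byte](0))
    token = client.evaluateChallenge(token)
    token = server.evaluateResponse(token)
    client.evaluateChallenge(token)
    require(client.isComplete && server.isComplete, "handshake did not complete")

    // The server wraps two messages; each wrapped buffer embeds its sequence number.
    val m1 = "message one".getBytes(UTF_8)
    val m2 = "message two".getBytes(UTF_8)
    val w1 = server.wrap(m1, 0, m1.length)
    val w2 = server.wrap(m2, 0, m2.length)

    // In-order delivery unwraps cleanly.
    println(new String(client.unwrap(w1, 0, w1.length), UTF_8))
    println(new String(client.unwrap(w2, 0, w2.length), UTF_8))

    // Re-delivering w2 (i.e. the receiver seeing a frame out of order) breaks the
    // sequence/MAC check: some providers throw a SaslException like the one in the
    // log above, others log "Unmatched MACs" and return an empty buffer.
    try {
      val out = client.unwrap(w2, 0, w2.length)
      println(s"out-of-order unwrap returned ${out.length} bytes")
    } catch {
      case e: SaslException => println(s"out-of-order unwrap rejected: ${e.getMessage}")
    }
  }
}
{code}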


> DIGEST-MD5: Out of order sequencing of messages from server
> -----------------------------------------------------------
>
>                 Key: SPARK-21495
>                 URL: https://issues.apache.org/jira/browse/SPARK-21495
>             Project: Spark
>          Issue Type: Bug
>          Components: Shuffle, Spark Core
>    Affects Versions: 1.6.1
>         Environment: OS: RedHat 7.1 64bit
> Spark: 1.6.1
>            Reporter: Xin Yu Pan
>


