Musty,
Thanks for the response. I saw the same issue in two environments, one running
2.6.0-cdh5.4.7 (rcf5ade14581a102afdf8b7689b31ef225e7362fc) and the other 2.7.1
(r15ecc87ccf4a0228f35af08fc56de536e6ce657a).
I've tried different combinations of the algorithm and key length. The latest
I've tried is:
<property>
  <name>dfs.encrypt.data.transfer</name>
  <value>true</value>
</property>
<property>
  <name>dfs.encrypt.data.transfer.algorithm</name>
  <value>3des</value>
</property>
<property>
  <name>dfs.encrypt.data.transfer.cipher.suites</name>
  <value>AES/CTR/NoPadding</value>
</property>
<property>
  <name>dfs.encrypt.data.transfer.cipher.key.bitlength</name>
  <value>128</value>
</property>
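Incidentally, my reading of the SecureMode page is that once
dfs.encrypt.data.transfer.cipher.suites is set to AES/CTR/NoPadding, AES handles
the bulk encryption and the 3des/rc4 algorithm only covers the initial key
exchange, so the algorithm property can likely be dropped. A minimal sketch of
what I believe is an equivalent config (assuming AES is the goal):

<!-- Minimal sketch, assuming AES/CTR/NoPadding is the desired bulk cipher;
     the legacy "algorithm" (3des/rc4) setting is omitted since it should only
     affect the initial key exchange once a cipher suite is configured. -->
<property>
  <name>dfs.encrypt.data.transfer</name>
  <value>true</value>
</property>
<property>
  <name>dfs.encrypt.data.transfer.cipher.suites</name>
  <value>AES/CTR/NoPadding</value>
</property>
<property>
  <name>dfs.encrypt.data.transfer.cipher.key.bitlength</name>
  <!-- 128, 192, or 256; 256 may need the JCE unlimited-strength policy files -->
  <value>128</value>
</property>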
From: Musty Rehmani <[email protected]>
Reply-To: "[email protected]" <[email protected]>
Date: Wednesday, April 6, 2016 at 3:54 PM
To: Lin Zhao <[email protected]>, "[email protected]" <[email protected]>
Subject: Re: Is it possible to turn on data node encryption without kerberos?
Kerberos is used to authenticate a user or service principal and grant access
to the cluster. It doesn't encrypt data blocks moving in and out of the cluster.
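That said, if I'm reading the SecureMode page right, the data transfer
encryption keys are handed out by the NameNode as part of the block access
token machinery, which the same page ties to Kerberos. My assumption (worth
verifying) is that without the setting below the client never receives an
encryption key and so never attempts the handshake:

<!-- Assumption, based on how the SecureMode docs group these features:
     data transfer encryption keys ride on block access tokens. -->
<property>
  <name>dfs.block.access.token.enable</name>
  <value>true</value>
</property>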
On Wed, Apr 6, 2016 at 4:36 PM, Lin Zhao <[email protected]> wrote:
I've been trying to secure block data transferred by HDFS. I added the
properties below to hdfs-site.xml and core-site.xml on the datanode and
namenode and restarted both.
<property>
  <name>dfs.encrypt.data.transfer</name>
  <value>true</value>
</property>
<property>
  <name>hadoop.rpc.protection</name>
  <value>privacy</value>
</property>
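For reference, my understanding of where each property belongs (assuming the
usual file split) is that the dfs.* setting goes in hdfs-site.xml and covers
the block data path, while the RPC setting goes in core-site.xml and covers
Hadoop RPC traffic:

<!-- hdfs-site.xml: encrypts the block data path between client and datanode -->
<property>
  <name>dfs.encrypt.data.transfer</name>
  <value>true</value>
</property>

<!-- core-site.xml: SASL "privacy" (encryption) for Hadoop RPC traffic -->
<property>
  <name>hadoop.rpc.protection</name>
  <value>privacy</value>
</property>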
When I try to put a file from the hdfs command-line shell, the operation fails
with "connection is reset", and I see the following in the datanode log:
"org.apache.hadoop.hdfs.server.datanode.DataNode: Failed to read expected
encryption handshake from client at /172.31.36.56:48271. Perhaps the client is
running an older version of Hadoop which does not support encryption"
I am able to reproduce this on two different deployments. I was following
https://hadoop.apache.org/docs/r2.7.2/hadoop-project-dist/hadoop-common/SecureMode.html#Authentication,
but didn't turn on Kerberos authentication; running with no authentication
(simple) otherwise works fine in my environment. Can this be the reason the
handshake fails?
Any help is appreciated.
Thanks,
Lin Zhao