I've been trying to encrypt block data transferred by HDFS. I added the properties
below to hdfs-site.xml and core-site.xml on both the DataNode and the NameNode, then restarted both.


<property>
  <name>dfs.encrypt.data.transfer</name>
  <value>true</value>
</property>

<property>
  <name>hadoop.rpc.protection</name>
  <value>privacy</value>
</property>


When I try to put a file from the HDFS command-line shell, the operation fails
with "Connection reset", and I see the following in the DataNode log:

"org.apache.hadoop.hdfs.server.datanode.DataNode: Failed to read expected 
encryption handshake from client at /172.31.36.56:48271. Perhaps the client is 
running an older version of Hadoop which does not support encryption"
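
For reference, the failing operation is a plain shell put; the path below is
just an example, not the exact one I used:

hdfs dfs -put testfile /tmp/testfile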


I am able to reproduce this on two different deployments. I was following
https://hadoop.apache.org/docs/r2.7.2/hadoop-project-dist/hadoop-common/SecureMode.html#Authentication,
but didn't turn on Kerberos authentication: my clusters run with no
authentication (simple auth), and that works fine otherwise. Could the missing
Kerberos setup be the reason the handshake fails?
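
One check I know of, in case a client-side configuration mismatch could explain
why the DataNode never sees the handshake, is to ask the client which value it
actually resolves (this assumes the shell reads the same configuration
directory as the daemons):

hdfs getconf -confKey dfs.encrypt.data.transfer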

Any help is appreciated.

Thanks,

Lin Zhao
