[ https://issues.apache.org/jira/browse/HADOOP-13694?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15664565#comment-15664565 ]
ASF GitHub Bot commented on HADOOP-13694:
-----------------------------------------
Github user karth295 commented on a diff in the pull request:
https://github.com/apache/hadoop/pull/135#discussion_r87856607
--- Diff: hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/crypto/OpensslCipher.c ---
@@ -201,12 +209,22 @@ JNIEXPORT jlong JNICALL Java_org_apache_hadoop_crypto_OpensslCipher_init
 {
   int jKeyLen = (*env)->GetArrayLength(env, key);
   int jIvLen = (*env)->GetArrayLength(env, iv);
-  if (jKeyLen != KEY_LENGTH_128 && jKeyLen != KEY_LENGTH_256) {
-    THROW(env, "java/lang/IllegalArgumentException", "Invalid key length.");
+  if (jKeyLen != KEY_LENGTH_128 && jKeyLen != KEY_LENGTH_192 && jKeyLen != KEY_LENGTH_256) {
+    char* keyLenErrMsg;
+    if (asprintf(&keyLenErrMsg, "Invalid key length: %d bytes", jKeyLen) < 0) {
+      THROW(env, "java/lang/IllegalArgumentException", "Invalid key length");
+    } else {
+      THROW(env, "java/lang/IllegalArgumentException", keyLenErrMsg);
+    }
     return (jlong)0;
   }
   if (jIvLen != IV_LENGTH) {
-    THROW(env, "java/lang/IllegalArgumentException", "Invalid iv length.");
+    char* ivLenErrMsg;
+    if (asprintf(&ivLenErrMsg, "Invalid iv length: %d bytes", jIvLen) < 0) {
+      THROW(env, "java/lang/IllegalArgumentException", "Invalid iv length.");
+    } else {
+      THROW(env, "java/lang/IllegalArgumentException", ivLenErrMsg);
--- End diff --
Same here -- you need to free `ivLenErrMsg`
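
To make the suggestion concrete, here is a minimal sketch of the key-length branch with the allocation released after the throw. Names follow the diff; the sketch assumes Hadoop's THROW macro only sets a pending exception via JNI ThrowNew (which copies the message string), so the native code keeps executing and the buffer can be freed immediately:

    if (jKeyLen != KEY_LENGTH_128 && jKeyLen != KEY_LENGTH_192 &&
        jKeyLen != KEY_LENGTH_256) {
      char* keyLenErrMsg;
      if (asprintf(&keyLenErrMsg, "Invalid key length: %d bytes", jKeyLen) < 0) {
        /* asprintf failed, so nothing was allocated -- throw a static message */
        THROW(env, "java/lang/IllegalArgumentException", "Invalid key length");
      } else {
        /* ThrowNew copies the message into the exception, so the buffer
           can be released right after the throw */
        THROW(env, "java/lang/IllegalArgumentException", keyLenErrMsg);
        free(keyLenErrMsg);
      }
      return (jlong)0;
    }

The same pattern applies to the `ivLenErrMsg` branch. Note that `asprintf` is a GNU/BSD extension, so the file may need the appropriate feature-test macro (or a `snprintf`-based fallback) on platforms where it is not available.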
> Data transfer encryption with AES 192: Invalid key length.
> ----------------------------------------------------------
>
> Key: HADOOP-13694
> URL: https://issues.apache.org/jira/browse/HADOOP-13694
> Project: Hadoop Common
> Issue Type: Improvement
> Components: security
> Affects Versions: 2.7.2
> Environment: OS: Ubuntu 14.04
> /hadoop-2.7.2/bin$ uname -a
> Linux wkstn-kpalaniappan 3.13.0-79-generic #123-Ubuntu SMP Fri Feb 19
> 14:27:58 UTC 2016 x86_64 x86_64 x86_64 GNU/Linux
> /hadoop-2.7.2/bin$ java -version
> java version "1.7.0_95"
> OpenJDK Runtime Environment (IcedTea 2.6.4) (7u95-2.6.4-0ubuntu0.14.04.1)
> OpenJDK 64-Bit Server VM (build 24.95-b01, mixed mode)
> Hadoop version: 2.7.2
> Reporter: Karthik Palaniappan
> Assignee: Harsh J
>
> Configuring AES 128 or AES 256 encryption
> (dfs.encrypt.data.transfer.cipher.key.bitlength = [128, 256]) works perfectly
> fine. Trying to use AES 192 generates this exception on the datanode:
> 16/02/29 17:34:10 ERROR datanode.DataNode: wkstn-kpalaniappan:50010:DataXceiver error processing unknown operation src: /127.0.0.1:57237 dst: /127.0.0.1:50010
> java.lang.IllegalArgumentException: Invalid key length.
> at org.apache.hadoop.crypto.OpensslCipher.init(Native Method)
> at org.apache.hadoop.crypto.OpensslCipher.init(OpensslCipher.java:176)
> at org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec$OpensslAesCtrCipher.init(OpensslAesCtrCryptoCodec.java:116)
> at org.apache.hadoop.crypto.CryptoInputStream.updateDecryptor(CryptoInputStream.java:290)
> at org.apache.hadoop.crypto.CryptoInputStream.resetStreamOffset(CryptoInputStream.java:303)
> at org.apache.hadoop.crypto.CryptoInputStream.<init>(CryptoInputStream.java:128)
> at org.apache.hadoop.crypto.CryptoInputStream.<init>(CryptoInputStream.java:109)
> at org.apache.hadoop.crypto.CryptoInputStream.<init>(CryptoInputStream.java:133)
> at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil.createStreamPair(DataTransferSaslUtil.java:345)
> at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferServer.doSaslHandshake(SaslDataTransferServer.java:396)
> at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferServer.getEncryptedStreams(SaslDataTransferServer.java:178)
> at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferServer.receive(SaslDataTransferServer.java:110)
> at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:193)
> at java.lang.Thread.run(Thread.java:745)
> And this exception on the client:
> /hadoop-2.7.2/bin$ ./hdfs dfs -copyFromLocal ~/.vimrc /vimrc
> 16/02/29 17:34:10 WARN hdfs.DFSClient: DataStreamer Exception
> java.lang.IllegalArgumentException: Invalid key length.
> at org.apache.hadoop.crypto.OpensslCipher.init(Native Method)
> at org.apache.hadoop.crypto.OpensslCipher.init(OpensslCipher.java:176)
> at org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec$OpensslAesCtrCipher.init(OpensslAesCtrCryptoCodec.java:116)
> at org.apache.hadoop.crypto.CryptoInputStream.updateDecryptor(CryptoInputStream.java:290)
> at org.apache.hadoop.crypto.CryptoInputStream.resetStreamOffset(CryptoInputStream.java:303)
> at org.apache.hadoop.crypto.CryptoInputStream.<init>(CryptoInputStream.java:128)
> at org.apache.hadoop.crypto.CryptoInputStream.<init>(CryptoInputStream.java:109)
> at org.apache.hadoop.crypto.CryptoInputStream.<init>(CryptoInputStream.java:133)
> at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil.createStreamPair(DataTransferSaslUtil.java:345)
> at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.doSaslHandshake(SaslDataTransferClient.java:490)
> at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.getEncryptedStreams(SaslDataTransferClient.java:299)
> at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.send(SaslDataTransferClient.java:242)
> at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.checkTrustAndSend(SaslDataTransferClient.java:211)
> at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.socketSend(SaslDataTransferClient.java:183)
> at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1318)
> at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
> at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:449)
> copyFromLocal: DataStreamer Exception:
> The issue is in the OpenSSL C code:
> https://github.com/apache/hadoop/blob/trunk/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/crypto/OpensslCipher.c#L204.
> It asserts that the key length is 128 or 256 bits, but does not allow 192.
> Multiple Hadoop documents say that dfs.encrypt.data.transfer.cipher.key.bitlength can be set to 128, 192, or 256:
> https://hadoop.apache.org/docs/r2.6.0/hadoop-project-dist/hadoop-common/SecureMode.html,
> https://hadoop.apache.org/docs/r2.7.2/hadoop-project-dist/hadoop-hdfs/hdfs-default.xml.
> Are these documents wrong? Or is it an environment-specific issue?
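
For readers reproducing the report, the setting in question lives in hdfs-site.xml. A minimal sketch of the configuration that triggers the error above (the property names are the ones referenced in the description and in hdfs-default.xml; the encryption toggle and cipher suite are shown only for completeness):

    <!-- hdfs-site.xml: encrypted data transfer with a 192-bit AES key -->
    <property>
      <name>dfs.encrypt.data.transfer</name>
      <value>true</value>
    </property>
    <property>
      <name>dfs.encrypt.data.transfer.cipher.suites</name>
      <value>AES/CTR/NoPadding</value>
    </property>
    <property>
      <name>dfs.encrypt.data.transfer.cipher.key.bitlength</name>
      <!-- 128 and 256 work; 192 produces "Invalid key length." -->
      <value>192</value>
    </property>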