[ 
https://issues.apache.org/jira/browse/HADOOP-11329?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14228610#comment-14228610
 ] 

Allen Wittenauer commented on HADOOP-11329:
-------------------------------------------

bq. KMS can be deployed totally independent of Hadoop 

Given that the stack trace passes through Hadoop common, KMS clearly requires 
Hadoop to be installed on the same node, so this statement comes across as 
misleading.  That said, I don't think adding another configuration variable in 
lieu of just using the standard ones that Hadoop common expects (in 
particular, $HADOOP_PREFIX and $HADOOP_COMMON_LIB_NATIVE_DIR) is really the 
proper fix here.

It seems as though the simplest fix, until the KMS shell code can be brought 
in line with the rest of the shell code (HADOOP-10788), would be to populate 
hadoop.home.dir from $KMS_HOME when 
$HADOOP_PREFIX/$HADOOP_COMMON_LIB_NATIVE_DIR is empty.
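Something along these lines in kms.sh, sketched with assumed names (resolve_hadoop_home is hypothetical, and the CATALINA_OPTS plumbing is only an illustration of where the system property would be injected, not the actual patch):

```shell
# Minimal sketch of the suggested fallback (names are assumptions):
resolve_hadoop_home() {
  # Prefer the standard variable; fall back to KMS_HOME when it is empty.
  if [ -n "${HADOOP_PREFIX}" ]; then
    echo "${HADOOP_PREFIX}"
  else
    echo "${KMS_HOME}"
  fi
}

# Example: only KMS_HOME is set, so hadoop.home.dir falls back to it.
HADOOP_PREFIX=""
KMS_HOME="/opt/kms"
CATALINA_OPTS="${CATALINA_OPTS} -Dhadoop.home.dir=$(resolve_hadoop_home)"
echo "${CATALINA_OPTS}"
```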

> should add HADOOP_HOME as part of kms's startup options
> -------------------------------------------------------
>
>                 Key: HADOOP-11329
>                 URL: https://issues.apache.org/jira/browse/HADOOP-11329
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: kms, security
>            Reporter: Dian Fu
>            Assignee: Arun Suresh
>         Attachments: HADOOP-11329.1.patch, HADOOP-11329.2.patch, 
> HADOOP-11329.3.patch, HADOOP-11329.4.patch
>
>
> Currently, HADOOP_HOME isn't part of the startup options of KMS. If I add 
> the following configuration to the core-site.xml of KMS,
> {code} <property>
>   <name>hadoop.security.crypto.codec.classes.aes.ctr.nopadding</name>
>   <value>org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec</value>
>  </property>
> {code} the KMS server throws the following exception when it receives a 
> "generateEncryptedKey" request:
> {code}
> 2014-11-24 10:23:18,189 DEBUG org.apache.hadoop.crypto.OpensslCipher: Failed 
> to load OpenSSL Cipher.
> java.lang.UnsatisfiedLinkError: 
> org.apache.hadoop.util.NativeCodeLoader.buildSupportsOpenssl()Z
>         at 
> org.apache.hadoop.util.NativeCodeLoader.buildSupportsOpenssl(Native Method)
>         at 
> org.apache.hadoop.crypto.OpensslCipher.<clinit>(OpensslCipher.java:85)
>         at 
> org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec.<init>(OpensslAesCtrCryptoCodec.java:50)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native 
> Method)
>         at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>         at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>         at 
> org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:129)
>         at 
> org.apache.hadoop.crypto.CryptoCodec.getInstance(CryptoCodec.java:67)
>         at 
> org.apache.hadoop.crypto.CryptoCodec.getInstance(CryptoCodec.java:100)
>         at 
> org.apache.hadoop.crypto.key.KeyProviderCryptoExtension$DefaultCryptoExtension.generateEncryptedKey(KeyProviderCryptoExtension.java:256)
>         at 
> org.apache.hadoop.crypto.key.KeyProviderCryptoExtension.generateEncryptedKey(KeyProviderCryptoExtension.java:371)
>         at 
> org.apache.hadoop.crypto.key.kms.server.EagerKeyGeneratorKeyProviderCryptoExtension$CryptoExtension$EncryptedQueueRefiller.fillQueueForKey(EagerKeyGeneratorKeyProviderCryptoExtension.java:77)
>         at 
> org.apache.hadoop.crypto.key.kms.ValueQueue$1.load(ValueQueue.java:181)
>         at 
> org.apache.hadoop.crypto.key.kms.ValueQueue$1.load(ValueQueue.java:175)
>         at 
> com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3568)
>         at 
> com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2350)
>         at 
> com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2313)
>         at 
> com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2228)
>         at com.google.common.cache.LocalCache.get(LocalCache.java:3965)
>         at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3969)
>         at 
> com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4829)
>         at 
> org.apache.hadoop.crypto.key.kms.ValueQueue.getAtMost(ValueQueue.java:256)
>         at 
> org.apache.hadoop.crypto.key.kms.ValueQueue.getNext(ValueQueue.java:226)
>         at 
> org.apache.hadoop.crypto.key.kms.server.EagerKeyGeneratorKeyProviderCryptoExtension$CryptoExtension.generateEncryptedKey(EagerKeyGeneratorKeyProviderCryptoExtension.java:126)
>         at 
> org.apache.hadoop.crypto.key.KeyProviderCryptoExtension.generateEncryptedKey(KeyProviderCryptoExtension.java:371)
>         at 
> org.apache.hadoop.crypto.key.kms.server.KeyAuthorizationKeyProvider.generateEncryptedKey(KeyAuthorizationKeyProvider.java:192)
>         at org.apache.hadoop.crypto.key.kms.server.KMS$9.run(KMS.java:379)
>         at org.apache.hadoop.crypto.key.kms.server.KMS$9.run(KMS.java:375)
> {code}
> The reason is that it cannot find libhadoop.so, which prevents KMS from 
> responding to "generateEncryptedKey" requests.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
