[ https://issues.apache.org/jira/browse/HADOOP-14333?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15977751#comment-15977751 ]
Andrew Wang commented on HADOOP-14333:
--------------------------------------
I took a look at Hive's usage of DFSClient. They're digging into our private
bits to get access to the KeyProvider, which they use to fetch key metadata,
create keys, delete keys, etc.
This seems pretty easy to handle in a supportable way by adding a new HdfsAdmin
API for {{getKeyProvider}}. The issue is that the Hive team never told us they
needed access to a KP, so we were unaware of this need.
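To make that concrete, here is a rough caller-side sketch of what such an accessor could enable. {{HdfsAdmin#getKeyProvider}} is the proposed addition, not an API that exists in 2.8.1/3.0.0-alpha3, and the NameNode URI and key names are illustrative only.
{code}
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.crypto.key.KeyProvider;
import org.apache.hadoop.hdfs.client.HdfsAdmin;

public class KeyProviderAccessSketch {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // Illustrative NameNode URI.
    HdfsAdmin admin = new HdfsAdmin(URI.create("hdfs://nn:8020"), conf);

    // Proposed accessor: hand callers the KeyProvider through a public,
    // supported API instead of DFSClient internals. Not present in current
    // releases; this is the API being suggested in this comment.
    KeyProvider provider = admin.getKeyProvider();

    // The operations Hive reportedly needs: key metadata, create, delete.
    KeyProvider.Metadata meta = provider.getMetadata("warehouse-key");
    System.out.println(meta);
    provider.createKey("new-table-key", new KeyProvider.Options(conf));
    provider.deleteKey("obsolete-key");
    provider.flush();
  }
}
{code}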
We're already swallowing this exception in e.g. {{getTrashRoot}} for
compatibility reasons, so the question is really whether we swallow it in
DFSClient vs. in DFS. This doesn't seem worth making a stink over, given that
the impact is that existing versions of Hive won't work with 2.8.1 or
3.0.0-alpha3 HDFS clients.
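For reference, swallowing it on the DFSClient side amounts to roughly the following: attempt the provider lookup and, on failure, fall back to the static config check instead of propagating the IOException to reflective callers. This is a standalone sketch of the compatibility behavior, with a stand-in for the real provider resolution, not the actual patch.
{code}
import java.io.IOException;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;

// Standalone sketch of the DFSClient-side compatibility behavior: swallow
// the IOException from key-provider resolution rather than surfacing it to
// callers that expect the pre-HADOOP-14104 signature.
public class EncryptionCheckSketch {

  private final Configuration conf;

  public EncryptionCheckSketch(Configuration conf) {
    this.conf = conf;
  }

  public boolean isHDFSEncryptionEnabled() {
    try {
      // Stand-in for the real lookup (resolving the key provider URI),
      // which is what can throw.
      return resolveKeyProviderUri() != null;
    } catch (IOException ioe) {
      // Compatibility path: fall back to the static config check, mirroring
      // what getTrashRoot already does, instead of rethrowing.
      // "dfs.encryption.key.provider.uri" is the value of
      // DFSConfigKeys.DFS_ENCRYPTION_KEY_PROVIDER_URI.
      return !conf.getTrimmed("dfs.encryption.key.provider.uri", "").isEmpty();
    }
  }

  // Hypothetical helper standing in for DFSClient's provider resolution.
  private URI resolveKeyProviderUri() throws IOException {
    throw new IOException("key provider lookup failed (illustrative)");
  }
}
{code}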
My proposal:
* We move swallowing the exception back to DFSClient for now
* Add whatever APIs Hive requires to HdfsAdmin and get them to use those instead (a caller-side sketch follows this list)
* Clean things up in HDFS as we want
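For the second bullet, here is a caller-side sketch of what the Hive helper could look like once a public accessor exists: no reflection into DFSClient, and no dependence on a private method keeping an exception-free signature. {{getKeyProvider}} here is the proposed HdfsAdmin API, not something shipping in 2.8.1/3.0.0-alpha3.
{code}
import java.io.IOException;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hdfs.client.HdfsAdmin;

// Sketch of a Hive-side replacement for the reflection hack quoted below,
// written against the proposed public HdfsAdmin accessor.
public class HiveEncryptionCheckSketch {

  static boolean isEncryptionEnabled(URI nameNodeUri, Configuration conf) {
    try {
      HdfsAdmin admin = new HdfsAdmin(nameNodeUri, conf);
      // Proposed API; encryption is considered enabled when a key provider
      // is configured and resolvable.
      return admin.getKeyProvider() != null;
    } catch (IOException ioe) {
      // Fall back to the static config check, as the current Hive code does
      // for older Hadoop versions.
      return !conf.getTrimmed("dfs.encryption.key.provider.uri", "").isEmpty();
    }
  }
}
{code}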
> New exception thrown by (private) DFSClient API isHDFSEncryptionEnabled broke hacky hive code
> ----------------------------------------------------------------------------------------------
>
> Key: HADOOP-14333
> URL: https://issues.apache.org/jira/browse/HADOOP-14333
> Project: Hadoop Common
> Issue Type: Bug
> Affects Versions: 2.8.1, 3.0.0-alpha3
> Reporter: Yongjun Zhang
> Assignee: Yongjun Zhang
> Attachments: HADOOP-14333.001.patch
>
>
> Though Hive should be fixed not to access DFSClient, which is private to
> Hadoop, removing the throws clause added by HADOOP-14104 is a quicker
> solution to unblock Hive.
> Hive code
> {code}
>   private boolean isEncryptionEnabled(DFSClient client, Configuration conf) {
>     try {
>       DFSClient.class.getMethod("isHDFSEncryptionEnabled");
>     } catch (NoSuchMethodException e) {
>       // the method is available since Hadoop-2.7.1
>       // if we run with an older Hadoop, check this ourselves
>       return !conf.getTrimmed(
>           DFSConfigKeys.DFS_ENCRYPTION_KEY_PROVIDER_URI, "").isEmpty();
>     }
>     return client.isHDFSEncryptionEnabled();
>   }
> {code}