[
https://issues.apache.org/jira/browse/HIVE-13384?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15288963#comment-15288963
]
Bing Li commented on HIVE-13384:
Referring to DRILL-3413, we found a way to resolve this issue on the client
side.
The key point is to get a delegation token for the proxy user and assign it to
hive.metastore.token.signature.
I tried this method in two different scenarios:
1. use the proxy user to initialize a HiveMetaStoreClient object, as
mentioned in the description
2. access a Hive table from Pig via HCatalog
Here is the sample code for the two scenarios above:
1. use the proxy user to create a HiveMetaStoreClient object

// In this example, user hive is the superuser, which does the login with
// its keytab and principal; user hdfs is the proxy user being impersonated.
UserGroupInformation loginUser = UserGroupInformation.getLoginUser();
UserGroupInformation ugi = UserGroupInformation.createProxyUser("hdfs", loginUser);

// Get the delegation token for proxy user hdfs; the owner of this token
// is hdfs as well.
HiveMetaStoreClient realUserClient = new HiveMetaStoreClient(new HiveConf());
String delegationTokenStr = realUserClient.getDelegationToken("hdfs", "hdfs");
realUserClient.close();

final String DELEGATION_TOKEN = "DelegationTokenForHiveMetaStoreServer";
// Create a delegation token object from the string and add it to the given UGI.
Utils.setTokenStr(ugi, delegationTokenStr, DELEGATION_TOKEN);

ugi.doAs(new PrivilegedExceptionAction<Void>() {
    public Void run() throws Exception {
        HiveConf hiveConf = new HiveConf();
        hiveConf.set("hive.metastore.token.signature", DELEGATION_TOKEN);
        HiveMetaStoreClient client = new HiveMetaStoreClient(hiveConf);
        return null;
    }
});
2. in a Pig Java program

HiveConf hiveConf = new HiveConf();
HCatClient client = HCatClient.create(hiveConf);
UserGroupInformation ugi =
    UserGroupInformation.createProxyUser(proxyUser,
        UserGroupInformation.getLoginUser());

// Get the delegation token and add it to the UGI.
String tokenStrForm = client.getDelegationToken(proxyUser, proxyUser);
final String DELEGATION_TOKEN = "DelegationTokenForHiveMetaStoreServer";
Utils.setTokenStr(ugi, tokenStrForm, DELEGATION_TOKEN);
client.close();

Properties pigProp = new Properties();
pigProp.setProperty("hive.metastore.token.signature", DELEGATION_TOKEN);

// Initialize the PigServer with the Pig properties.
final PigServer pigServer = new PigServer(ExecType.MAPREDUCE, pigProp);
ugi.doAs(new PrivilegedExceptionAction<Void>() {
    public Void run() throws Exception {
        loadJars(pigServer); // custom method
        runQuery(pigServer); // custom method
        return null;
    }
});
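Note that impersonation only works if Hadoop is configured to trust the
superuser in the first place. A minimal core-site.xml sketch for letting user
hive impersonate others (the wildcard values are examples only and should be
narrowed in production):

```xml
<!-- Allow the superuser "hive" to impersonate other users.
     The wildcard values below are illustrative; restrict them
     to specific hosts and groups in a real deployment. -->
<property>
  <name>hadoop.proxyuser.hive.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hive.groups</name>
  <value>*</value>
</property>
```

Without these properties, createProxyUser() succeeds locally but the server
side rejects the impersonated requests.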
> Failed to create HiveMetaStoreClient object with proxy user when Kerberos
> enabled
> -
>
> Key: HIVE-13384
> URL: https://issues.apache.org/jira/browse/HIVE-13384
> Project: Hive
> Issue Type: Improvement
> Components: Metastore
>Affects Versions: 1.2.0, 1.2.1
>Reporter: Bing Li
>
> I wrote a Java client to talk to the HiveMetaStore (Hive 1.2.0),
> but found that it can't create a new HiveMetaStoreClient object via a
> proxy user in a Kerberos environment.
> ===
> 15/10/13 00:14:38 ERROR transport.TSaslTransport: SASL negotiation failure
> javax.security.sasl.SaslException: GSS initiate failed [Caused by
> GSSException: No valid credentials provided (Mechanism level: Failed to find
> any Kerberos tgt)]
> at
> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
> at
> org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
> at
> org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
> ==
> While debugging Hive, I found that the error came from the open() method in
> the HiveMetaStoreClient class.
> Around line 406:
> transport = UserGroupInformation.getCurrentUser().doAs(new
> PrivilegedExceptionAction() { // FAILED, because the current user
> doesn't have the credential
> But it will work if I change the above line to
> transport = UserGroupInformation.getCurrentUser().getRealUser().doAs(new
> PrivilegedExceptionAction() { // PASS
> I found DRILL-3413 fixes this error on the Drill side as a workaround. But if I
> submit a MapReduce job via Pig/HCatalog, it runs into the same issue again
> when initializing the object via HCatalog.
> It would be better to fix this issue in