[ https://issues.apache.org/jira/browse/FLUME-2433?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15202772#comment-15202772 ]
Ping Wang commented on FLUME-2433:
----------------------------------
Hi Roshan,
I am using Flume with Hive in a Kerberos-enabled cluster and found your patch here. But
I hit a problem after applying your fix. I searched for "Server not found in
Kerberos database (7) - UNKNOWN_SERVER" and tried the solutions suggested on various
websites, but none of them helped. Could you please give me some pointers?
Thanks a lot!
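For context, here is a minimal sketch (not the actual FLUME-2433 patch code) of the call path my agent is exercising: the sink logs in from a keytab and opens a Hive Streaming connection as that Kerberos user. The principal, keytab path, metastore URI, database, table and partition values below are placeholders for my real settings.

{code:java}
import java.util.Arrays;

import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.security.UserGroupInformation;
import org.apache.hive.hcatalog.streaming.HiveEndPoint;
import org.apache.hive.hcatalog.streaming.StreamingConnection;

public class HiveStreamingKerberosSketch {
    public static void main(String[] args) throws Exception {
        // Client (Flume agent) identity from a keytab; principal and keytab are placeholders.
        UserGroupInformation ugi = UserGroupInformation.loginUserFromKeytabAndReturnUGI(
                "flume/agent-host.example.com@EXAMPLE.COM",
                "/etc/security/keytabs/flume.keytab");

        // The host in this metastore URI is the one the client asks the KDC for a
        // service ticket for (typically hive/<host>@REALM) during the SASL/GSS handshake.
        HiveConf conf = new HiveConf();
        conf.setVar(HiveConf.ConfVars.METASTOREURIS,
                "thrift://metastore-host.example.com:9083");

        HiveEndPoint endPoint = new HiveEndPoint(
                "thrift://metastore-host.example.com:9083",
                "default", "weblogs", Arrays.asList("2016-03-19"));

        // Same path as HiveWriter -> HiveEndPoint.newConnection in the trace below.
        StreamingConnection conn = endPoint.newConnection(true, conf, ugi);
        conn.close();
    }
}
{code}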
......
2016-03-19 01:28:44,253 (hive-k1-call-runner-0) [INFO - org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:377)] Trying to connect to metastore with URI thrift://***:9083
2016-03-19 01:28:44,316 (hive-k1-call-runner-0) [DEBUG - org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:243)] opening transport org.apache.thrift.transport.TSaslClientTransport@34ae96b5
2016-03-19 01:28:44,334 (hive-k1-call-runner-0) [ERROR - org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:296)] SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Server not found in Kerberos database (7) - UNKNOWN_SERVER)]
    at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
    at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
    at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:253)
    at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
    at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
    at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:432)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:237)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:182)
    at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.<init>(HiveClientCache.java:330)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:118)
    at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:231)
    at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:227)
    at com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4767)
    at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3568)
    at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2350)
    at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2313)
    at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2228)
    at com.google.common.cache.LocalCache.get(LocalCache.java:3965)
    at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4764)
    at org.apache.hive.hcatalog.common.HiveClientCache.getOrCreate(HiveClientCache.java:227)
    at org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:202)
    at org.apache.hive.hcatalog.common.HCatUtil.getHiveMetastoreClient(HCatUtil.java:558)
    at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.getMetaStoreClient(HiveEndPoint.java:448)
    at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:274)
    at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:243)
    at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnectionImpl(HiveEndPoint.java:180)
    at org.apache.hive.hcatalog.streaming.HiveEndPoint.access$000(HiveEndPoint.java:58)
    at org.apache.hive.hcatalog.streaming.HiveEndPoint$1.run(HiveEndPoint.java:167)
    at org.apache.hive.hcatalog.streaming.HiveEndPoint$1.run(HiveEndPoint.java:162)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnection(HiveEndPoint.java:161)
    at org.apache.flume.sink.hive.HiveWriter$6.call(HiveWriter.java:316)
    at org.apache.flume.sink.hive.HiveWriter$6.call(HiveWriter.java:313)
    at org.apache.flume.sink.hive.HiveWriter$9.call(HiveWriter.java:366)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: GSSException: No valid credentials provided (Mechanism level: Server not found in Kerberos database (7) - UNKNOWN_SERVER)
    at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:770)
    at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:248)
    at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
    at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
    ... 51 more
Caused by: KrbException: Server not found in Kerberos database (7) - UNKNOWN_SERVER
    at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:73)
    at sun.security.krb5.KrbTgsReq.getReply(KrbTgsReq.java:259)
    at sun.security.krb5.KrbTgsReq.sendAndGetCreds(KrbTgsReq.java:270)
    at sun.security.krb5.internal.CredentialsUtil.serviceCreds(CredentialsUtil.java:302)
    at sun.security.krb5.internal.CredentialsUtil.acquireServiceCreds(CredentialsUtil.java:120)
    at sun.security.krb5.Credentials.acquireServiceCreds(Credentials.java:458)
    at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:693)
    ... 54 more
Caused by: KrbException: Identifier doesn't match expected value (906)
    at sun.security.krb5.internal.KDCRep.init(KDCRep.java:140)
    at sun.security.krb5.internal.TGSRep.init(TGSRep.java:65)
    at sun.security.krb5.internal.TGSRep.<init>(TGSRep.java:60)
    at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:55)
    ... 60 more
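From what I understand so far, the service principal the client requests in this handshake is normally derived from hive.metastore.kerberos.principal, with _HOST replaced by the metastore host taken from the thrift URI; if the resulting principal is not registered in the KDC, the TGS request fails with exactly this "Server not found in Kerberos database (7)" error. A minimal sketch of that substitution (host name and realm are placeholders):

{code:java}
import org.apache.hadoop.security.SecurityUtil;

public class ServicePrincipalSketch {
    public static void main(String[] args) throws Exception {
        // _HOST is replaced by the host the client actually connects to; the
        // resulting principal must exist in the KDC, otherwise the TGS request
        // is answered with UNKNOWN_SERVER (error 7), as in the trace above.
        String principal = SecurityUtil.getServerPrincipal(
                "hive/_HOST@EXAMPLE.COM", "metastore-host.example.com");
        System.out.println(principal); // hive/metastore-host.example.com@EXAMPLE.COM
    }
}
{code}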
> Add kerberos support for Hive sink
> ----------------------------------
>
> Key: FLUME-2433
> URL: https://issues.apache.org/jira/browse/FLUME-2433
> Project: Flume
> Issue Type: Bug
> Components: Sinks+Sources
> Affects Versions: v1.5.0.1
> Reporter: Roshan Naik
> Assignee: Roshan Naik
> Labels: HiveSink, Kerberos,
> Attachments: FLUME-2433.patch, FLUME-2433.v2.patch
>
>
> Add Kerberos authentication support for Hive sink.
> FYI: The HCatalog API support for Kerberos is not available in Hive 0.13.1;
> this should be available in the next Hive release.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)