I saw a mail titled "HCatalog Security". That person's problem was similar to 
mine, and the reply was:
"This issue goes away after doing a kinit -R".

So I tried the same operation, but it failed:
kinit: Ticket expired while renewing credentials

But in my /etc/krb5.conf, I have already configured this item:
renew_lifetime=7d
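
For reference, kinit -R can only renew a ticket that was actually issued as 
renewable; renew_lifetime in krb5.conf does not help if the KDC or the 
principal's policy caps the renewable lifetime at zero (the klist output quoted 
below shows "renew until" equal to the ticket's start time, which would point 
that way). For a Java client, one way to sidestep renewal entirely is to 
re-login from the keytab programmatically. A minimal sketch using Hadoop's 
UserGroupInformation, with placeholder principal and keytab values:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class KeytabRelogin {
    public static void main(String[] args) throws IOException {
        // Placeholder values -- substitute your own principal and keytab path.
        String principal = "hive/your-host@YOUR.REALM";
        String keytab = "/path/to/hive.keytab";

        Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);

        // Initial login: obtains a fresh TGT from the keytab.
        UserGroupInformation.loginUserFromKeytab(principal, keytab);

        // Call this before long-running or periodic work: it logs in again
        // from the keytab when the TGT is close to expiring, instead of
        // renewing the existing ticket.
        UserGroupInformation.getLoginUser().checkTGTAndReloginFromKeytab();
    }
}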

So, can anybody give me some suggestions, please? Thank you.

At 2016-07-04 11:32:30, "Maria" <linanmengxia...@126.com> wrote:
>
>
>And I can successfully access hiveserver2 from beeline.
>
>
>I am quite confused by this error: "Peer indicated failure: GSS initiate failed".
>
> Can anybody please help me? Any reply will be much appreciated.
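>
>For what it is worth, one generic way to see why a GSS negotiation fails on 
>the client side (not something confirmed for this case) is to turn on the 
>JVM's Kerberos/JGSS debug output before the login and the JDBC call. A 
>minimal sketch:
>
>// Standard JVM debug switches; they print the ticket requests, the realm
>// and the encryption types being negotiated to stdout.
>System.setProperty("sun.security.krb5.debug", "true");
>System.setProperty("sun.security.jgss.debug", "true");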
>
>At 2016-07-04 11:26:53, "Maria" <linanmengxia...@126.com> wrote:
>>Yup, my hiveserver2 log errors are:
>>
>>ERROR [Hiveserver2-Handler-Pool: 
>>Thread-48]:server.TThreadPoolServer(TThreadPoolServer.java:run(296)) - error 
>>occurred during processing of message.
>>java.lang.RuntimeException: org.apache.thrift.transport.TTransportException: 
>>Peer indicated failure: GSS initiate failed
>>    at 
>> org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:219)
>>    at 
>> org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:739)
>>    at 
>> org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:736)
>>    at java.security.AccessController.doPrivileged(Native Method)
>>    at javax.security.auth.Subject.doAs(Subject.java:356)
>>    at 
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1608)
>>    at 
>> org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory.getTransport(HadoopThriftAuthBridge.java:736)
>>    at 
>> org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:268)
>>    at 
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>    at 
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>    at java.lang.Thread.run(Thread.java:745)
>>Caused by: org.apache.thrift.transport.TTransportException:Peer indicated 
>>failure: GSS initiate failed
>>    at 
>> org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:199)
>>    at 
>> org.apache.thrift.transport.TSaslServerTransport.handleSaslStartMessage(TSaslServerTransport.java:125)
>>    at 
>> org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
>>    at 
>> org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
>>    at 
>> org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)
>> ... 10 more
>>================================================
>>So it seems the Windows Hive JDBC client can at least reach the 
>>hiveserver2, right?
>>
>>I have checked everything I could:
>>(1) On the hiveserver2 node, I ran "klist"; the results are:
>>Ticket cache: FILE:/tmp/krb5cc_0
>>Default principal: hive/h...@hadoop.com
>>
>>Valid starting    Expires                     Service principal
>>07/04/16 10:28:14    07/05/16 10:28:14     krbtgt/hadoop....@hadoop.com
>>                 renew until 07/04/16 10:28:14
>>(2) In the Windows cmd prompt, I ran "klist"; the results are:
>>Ticket cache:API: 1
>>Default principal: hive/h...@hadoop.com
>>
>>Valid starting    Expires                     Service principal
>>07/04/16 10:24:32    07/05/16 10:24:32     krbtgt/hadoop....@hadoop.com
>>                 renew until 07/04/16 10:24:32
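>>
>>(For reference: the Windows klist above reflects a ticket cache that a Java 
>>keytab login does not necessarily use. A quick way to confirm what the Java 
>>process itself ended up with is to print the Hadoop login user right after 
>>loginUserFromKeytab -- a minimal sketch, reusing the UserGroupInformation 
>>setup from the test code further down:)
>>
>>UserGroupInformation ugi = UserGroupInformation.getLoginUser();
>>System.out.println("login user         : " + ugi.getUserName());
>>System.out.println("login from keytab  : " + UserGroupInformation.isLoginKeytabBased());
>>System.out.println("has kerberos creds : " + ugi.hasKerberosCredentials());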
>>
>> Is there anything else I have to add or set for hiveserver2?
>>
>>Thanks in advance.
>>
>>
>>Maria.
>>
>>At 2016-07-03 04:39:31, "Vivek Shrivastava" <vivshrivast...@gmail.com> wrote:
>> 
>>
>>Please look at the hiveserver2 log; it will have better error information. 
>>You can paste the error from the logs if you need help. 
>>
>>
>>Regards,
>>
>>
>>Vivek
>>
>>
>>On Sat, Jul 2, 2016 at 5:52 AM, Maria <linanmengxia...@126.com> wrote:
>>
>>
>>
>>Hi, all:
>>
>>     Recently, I attempted to access a Kerberized Hadoop cluster by launching 
>>Java applications from Windows workstations. I have configured Kerberos on my 
>>Windows 7 machine and can successfully access HDFS on port 50070. But when I 
>>use JDBC from Windows to connect to the remote hiveserver2, the following 
>>errors occurred:
>>
>>java.sql.SQLException:could not open client transport with JDBC 
>>Uri:jdbc:hive2://hm:10000/default;principal=hive/h...@hadoom.com: GSS 
>>initiate failed
>>
>>     at 
>>org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:231)
>>
>>     at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:176)
>>
>>     at org.apache.hive.jdbc.HiveDriver.connection(HiveDriver.java:105)
>>
>>     at java.sql.DriverManager.getConnection(Unknown Source)
>>
>>     at java.sql.DriverManager.getConnection(Unknown Source)
>>
>>     at 
>>org.apache.hadoop.hive.ql.security.authorization.plugin.KerberosTest.main(KerberosTest.java:41)
>>
>>Caused by: org.apache.thrift.transport.TTransportException:GSS initiate failed
>>
>>     at 
>>org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:232)
>>
>>     at 
>>org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:316)
>>
>>     at 
>>org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
>>
>>     at 
>>org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
>>
>>     at 
>>org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
>>
>>     at java.security.AccessController.doPrivileged(Native Method)
>>
>>     at javax.security.auth.Subject.doAs(Unknown Source)
>>
>>     at 
>>org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>>
>>     at  
>>org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
>>
>>     at 
>>org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:204)
>>
>>... 5 more
>>
>>------------------------------------------------------------------------------
>>
>>Below is my test code:
>>
>>
>>
>>import java.io.File;
>>import java.sql.Connection;
>>import java.sql.DriverManager;
>>import java.sql.ResultSet;
>>import java.sql.Statement;
>>
>>import org.apache.hadoop.conf.Configuration;
>>import org.apache.hadoop.security.UserGroupInformation;
>>
>>public class KerberosTest {
>>
>>    public static void main(String[] args) throws Exception {
>>
>>        String principal = "hive/h...@hadoom.com";
>>        String keytab = "E:\\Program Files (x86)\\java\\jre7\\lib\\security\\hive.keytab";
>>        String url = "jdbc:hive2://hm:10000/default;principal=hive/h...@hadoom.com";
>>
>>        // Load the cluster configuration files copied to the Windows client.
>>        Configuration conf = new Configuration();
>>        conf.addResource(new File("hdfs-site.xml").toURI().toURL());
>>        conf.addResource(new File("core-site.xml").toURI().toURL());
>>        conf.addResource(new File("yarn-site.xml").toURI().toURL());
>>        conf.addResource(new File("hive-site.xml").toURI().toURL());
>>
>>        // Log in to Kerberos from the keytab before opening the JDBC connection.
>>        conf.set("hadoop.security.authentication", "Kerberos");
>>        UserGroupInformation.setConfiguration(conf);
>>        UserGroupInformation.loginUserFromKeytab(principal, keytab);
>>
>>        Class.forName("org.apache.hive.jdbc.HiveDriver");
>>        Connection conn = DriverManager.getConnection(url);
>>
>>        Statement stmt = conn.createStatement();
>>        String sql = "select * from testkerberos";
>>        ResultSet rs = stmt.executeQuery(sql);
>>        while (rs.next()) {
>>            System.out.println(rs.getString(1));
>>        }
>>    }
>>}
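>>
>>For comparison, a commonly suggested variant of the connection step -- shown 
>>here only as a sketch, reusing the same (elided) principal, keytab and url 
>>values as above -- is to open the JDBC connection inside the logged-in user's 
>>doAs() block, so the SASL/GSS handshake runs with the keytab credentials 
>>explicitly:
>>
>>// Extra import needed: java.security.PrivilegedExceptionAction
>>final String principal = "hive/h...@hadoom.com";
>>final String keytab = "E:\\Program Files (x86)\\java\\jre7\\lib\\security\\hive.keytab";
>>final String url = "jdbc:hive2://hm:10000/default;principal=hive/h...@hadoom.com";
>>
>>// Log in from the keytab and keep the resulting UGI object.
>>UserGroupInformation ugi =
>>        UserGroupInformation.loginUserFromKeytabAndReturnUGI(principal, keytab);
>>
>>// Open the JDBC connection as that Kerberos identity. The principal in the
>>// URL names HiveServer2's service principal; the client's own identity
>>// comes from the keytab login above.
>>Connection conn = ugi.doAs(new PrivilegedExceptionAction<Connection>() {
>>    public Connection run() throws Exception {
>>        return DriverManager.getConnection(url);
>>    }
>>});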
>>
>>
>>
>>Has anyone had the same problem? Or does anyone know how to solve it?
>>
>>
>>
>>Thanks in advance.
>>
>>
>>
>>Maria.
>>
>>
>>
