Hi All,

I am trying to execute Hive queries through an Oozie Java action on a secured cluster; however, I am getting the exception below:

Exception Stack trace:

>>> Invoking Main class now >>>

Launch time = 1458542514082
Job launch time = 1458542514082 mapreduce.job.tags = oozie-2e8d7ed9fc7551a353667830e09bef2b
Main class        : com.citiustech.main.Test
Arguments         :

<<< Invocation of Main class completed <<<

Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.JavaMain], main() threw exception,
org.springframework.jdbc.CannotGetJdbcConnectionException: Could not get JDBC Connection; nested exception is java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://10.60.2.26:10000/default;principal=hive/example-qa1-[email protected];auth=NOSASL: GSS initiate failed
org.apache.oozie.action.hadoop.JavaMainException: org.springframework.jdbc.CannotGetJdbcConnectionException: Could not get JDBC Connection; nested exception is java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://10.60.2.26:10000/default;principal=hive/[email protected];auth=NOSASL: GSS initiate failed
  at org.apache.oozie.action.hadoop.JavaMain.run(JavaMain.java:58)
  at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:39)
  at org.apache.oozie.action.hadoop.JavaMain.main(JavaMain.java:36)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:606)
  at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:226)
  at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
  at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:450)
  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
  at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
  at java.security.AccessController.doPrivileged(Native Method)
  at javax.security.auth.Subject.doAs(Subject.java:415)
  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
  at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: org.springframework.jdbc.CannotGetJdbcConnectionException: Could not get JDBC Connection; nested exception is java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://10.60.2.26:10000/default;principal=hive/example-[email protected];auth=NOSASL: GSS initiate failed
  at org.springframework.jdbc.datasource.DataSourceUtils.getConnection(DataSourceUtils.java:80)
  at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:391)
  at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:471)
  at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:481)
  at org.springframework.jdbc.core.JdbcTemplate.queryForObject(JdbcTemplate.java:491)
  at org.springframework.jdbc.core.JdbcTemplate.queryForObject(JdbcTemplate.java:497)
  at com.citiustech.main.Test.main(Test.java:26)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:606)
  at org.apache.oozie.action.hadoop.JavaMain.run(JavaMain.java:55)
  ... 15 more
Caused by: java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://10.60.2.26:10000/default;principal=hive/[email protected];auth=NOSASL: GSS initiate failed
  at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:210)
  at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:156)
  at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
  at java.sql.DriverManager.getConnection(DriverManager.java:571)
  at java.sql.DriverManager.getConnection(DriverManager.java:187)
  at org.springframework.jdbc.datasource.DriverManagerDataSource.getConnectionFromDriverManager(DriverManagerDataSource.java:153)
  at org.springframework.jdbc.datasource.DriverManagerDataSource.getConnectionFromDriver(DriverManagerDataSource.java:144)
  at org.springframework.jdbc.datasource.AbstractDriverBasedDataSource.getConnectionFromDriver(AbstractDriverBasedDataSource.java:155)
  at org.springframework.jdbc.datasource.AbstractDriverBasedDataSource.getConnection(AbstractDriverBasedDataSource.java:120)
  at org.springframework.jdbc.datasource.DataSourceUtils.doGetConnection(DataSourceUtils.java:111)
  at org.springframework.jdbc.datasource.DataSourceUtils.getConnection(DataSourceUtils.java:77)
  ... 26 more
Caused by: org.apache.thrift.transport.TTransportException: GSS initiate failed
  at org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:232)
  at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:316)
  at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
  at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
  at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
  at java.security.AccessController.doPrivileged(Native Method)
  at javax.security.auth.Subject.doAs(Subject.java:415)
  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
  at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
  at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:185)
  ... 36 more

Oozie Launcher failed, finishing Hadoop job gracefully

Oozie Launcher, uploading action data to HDFS sequence file: hdfs://example-qa1-nn:8020/user/admin/oozie-oozi/0000000-160317141545276-oozie-oozi-W/javaAction--java/action-data.seq

Oozie Launcher ends


Please find below the workflow XML for the Oozie Java action, along with the Java program that executes the Hive queries:

Workflow.xml:


<workflow-app name="WorkFlowForJavaActionToExecuteHiveQuery" xmlns="uri:oozie:workflow:0.2.5">

    <credentials>
        <credential name='hive_credentials' type='hcat'>
            <property>
                <name>hcat.metastore.uri</name>
                <value>thrift://example-qa1-dn2:9083</value>
            </property>
            <property>
                <name>hcat.metastore.principal</name>
                <value>hive/[email protected]</value>
            </property>
        </credential>
    </credentials>

    <start to="javaAction"/>

    <action name="javaAction" cred="hive_credentials">
        <java>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <main-class>com.citiustech.main.Test</main-class>
        </java>
        <ok to="end"/>
        <error to="fail"/>
    </action>

    <kill name="fail">
        <message>Job failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>

    <end name="end"/>

</workflow-app>
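
For reference, the job.properties passed at submission defines the parameters referenced above roughly as follows (the jobTracker host and the application path are placeholders from my setup; the name node value is the one that appears in the launcher log):

nameNode=hdfs://example-qa1-nn:8020
jobTracker=<resource-manager-host>:8050
oozie.wf.application.path=${nameNode}/user/admin/<workflow-dir>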


Java Program:


package com.citiustech.main;

import java.io.IOException;

import org.apache.hadoop.security.UserGroupInformation;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.datasource.DriverManagerDataSource;

public class Test {

    public static void main(String[] args) throws IOException {
        // Kerberos login from the admin keytab (done once before and once
        // after switching the Hadoop configuration to Kerberos).
        UserGroupInformation.loginUserFromKeytab("[email protected]",
                "/etc/security/keytabs/admin.keytab");

        org.apache.hadoop.conf.Configuration conf = new org.apache.hadoop.conf.Configuration();
        conf.set("hadoop.security.authentication", "Kerberos");
        UserGroupInformation.setConfiguration(conf);
        UserGroupInformation.loginUserFromKeytab("[email protected]",
                "/etc/security/keytabs/admin.keytab");

        // HiveServer2 connection through Spring's DriverManagerDataSource.
        DriverManagerDataSource dataSource = new DriverManagerDataSource(
                "jdbc:hive2://10.60.2.26:10001/default;principal=hive/[email protected]",
                "hive", "");
        dataSource.setDriverClassName("org.apache.hive.jdbc.HiveDriver");
        JdbcTemplate jdbcTemplate = new JdbcTemplate();
        jdbcTemplate.setDataSource(dataSource);

        long count = jdbcTemplate.queryForObject("SELECT COUNT(*) FROM provider", Long.class);
        System.out.println("Count is *************************" + count);
    }
}

I have submitted the workflow job as the admin user, who has access to Hive.
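
For completeness, the submission itself is the standard CLI invocation, roughly as follows (the Oozie server host below is a placeholder):

kinit -kt /etc/security/keytabs/admin.keytab <admin-principal>
oozie job -oozie http://<oozie-host>:11000/oozie -config job.properties -run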

I also did the kinit programmatically for the admin user using the UserGroupInformation API, but I still get the exception above.
I have kept only the Spring Core, Spring JDBC and Hive JDBC jars in the Oozie HDFS cache.
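
To be concrete about the jars, they are staged next to the workflow on HDFS roughly like this (the application path and jar file names below are placeholders for the actual versions I use):

hdfs dfs -mkdir -p /user/admin/<workflow-dir>/lib
hdfs dfs -put spring-core-<version>.jar spring-jdbc-<version>.jar hive-jdbc-<version>.jar /user/admin/<workflow-dir>/lib/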

I have also tried the solutions suggested on Hortonworks, Stack Overflow and other forums, but I am still facing the same issue.

Can anyone suggest a solution for this issue? For example, does the JDBC call need to run inside a UserGroupInformation.doAs() block, along the lines of the sketch below?
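
Something like this (just a sketch that rearranges the Test class above; I have not confirmed this is the right pattern, so the doAs wrapping is purely hypothetical):

package com.citiustech.main;

import java.security.PrivilegedExceptionAction;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.datasource.DriverManagerDataSource;

public class TestDoAs {

    public static void main(String[] args) throws Exception {
        // Same security setup as in Test above.
        Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "Kerberos");
        UserGroupInformation.setConfiguration(conf);

        // Hypothetical variant: obtain a UGI from the keytab and run the
        // JDBC query inside doAs(), so it uses that login context.
        UserGroupInformation ugi = UserGroupInformation.loginUserFromKeytabAndReturnUGI(
                "[email protected]", "/etc/security/keytabs/admin.keytab");

        long count = ugi.doAs(new PrivilegedExceptionAction<Long>() {
            @Override
            public Long run() throws Exception {
                DriverManagerDataSource dataSource = new DriverManagerDataSource(
                        "jdbc:hive2://10.60.2.26:10001/default;principal=hive/[email protected]",
                        "hive", "");
                dataSource.setDriverClassName("org.apache.hive.jdbc.HiveDriver");
                JdbcTemplate jdbcTemplate = new JdbcTemplate(dataSource);
                return jdbcTemplate.queryForObject("SELECT COUNT(*) FROM provider", Long.class);
            }
        });
        System.out.println("Count is " + count);
    }
}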


Thanks & Regards,
Ashish





