I am able to connect to Hive via beeline using the same connection URL. The 
Java code also works when run directly, without Oozie.

Please find below the beeline logs:

[admin@hscale-qa1-dn3 ~]$ beeline
Beeline version 0.14.0.2.2.8.0-3150 by Apache Hive
beeline> !connect jdbc:hive2://10.60.2.26:10000/default;principal=hive/[email protected]
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.2.8.0-3150/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.2.8.0-3150/hive/lib/hive-jdbc-0.14.0.2.2.8.0-3150-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
scan complete in 3ms
Connecting to jdbc:hive2://10.60.2.26:10000/default;principal=hive/[email protected]
Enter username for jdbc:hive2://10.60.2.26:10000/default;principal=hive/[email protected]: hive
Enter password for jdbc:hive2://10.60.2.26:10000/default;principal=hive/[email protected]:
Connected to: Apache Hive (version 0.14.0.2.2.8.0-3150)
Driver: Hive JDBC (version 0.14.0.2.2.8.0-3150)
Transaction isolation: TRANSACTION_REPEATABLE_READ
0: jdbc:hive2://10.60.2.26:10000/default> show tables
. . . . . . . . . . . . . . . . . . . . > ;
+-----------------------------------------------------------+--+
|                         tab_name                          |
+-----------------------------------------------------------+--+
| patient_detail                                            |
| sss1_emp3_temp                                            |
| testngssrqa2_testng_patient_detail_withbatchid10_sscode2  |
| testngssrqa2_testng_patient_detail_withbatchid15          |
| testngssrqa2_testng_patient_detail_withbatchid55_sscode2  |
| testngssrqa2_testng_patient_detail_withbatchid6           |
| testorg2_testssc1_emp3                                    |
| testorg2_testssc1_emp4                                    |
| testorg2_testssc1_encounter                               |
| testorg2_testssc1_newciti                                 |
| testorg2_testssc1_patient_detail_temp                     |
+-----------------------------------------------------------+--+
11 rows selected (0.328 seconds)
0: jdbc:hive2://10.60.2.26:10000/default>
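
One thing I notice comparing the two: the URI in the Oozie stack trace ends in ";auth=NOSASL", which the working beeline URI above does not have. As far as I understand, Kerberos is negotiated over SASL, so "principal=" combined with "auth=NOSASL" cannot succeed (also, the pasted Java source connects to port 10001, while the trace shows 10000, so the deployed code may differ from what was posted). A small self-contained check of the session variables, just for illustration (JdbcUriCheck and sessionVars are hypothetical names, not part of any Hive API):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class JdbcUriCheck {

    // Parse the ';'-separated session variables that follow the database
    // name in a hive2 JDBC URI, e.g. principal=... and auth=...
    static Map<String, String> sessionVars(String uri) {
        Map<String, String> vars = new LinkedHashMap<>();
        String[] parts = uri.split(";");
        for (int i = 1; i < parts.length; i++) { // parts[0] is jdbc:hive2://host:port/db
            String[] kv = parts[i].split("=", 2);
            if (kv.length == 2) {
                vars.put(kv[0], kv[1]);
            }
        }
        return vars;
    }

    public static void main(String[] args) {
        // URI exactly as it appears in the Oozie launcher stack trace
        String failing = "jdbc:hive2://10.60.2.26:10000/default;"
                + "principal=hive/[email protected];auth=NOSASL";
        Map<String, String> vars = sessionVars(failing);
        // Kerberos is negotiated over SASL, so these two settings conflict
        System.out.println("principal = " + vars.get("principal"));
        System.out.println("auth      = " + vars.get("auth"));
    }
}
```

If the URL used inside the Oozie action is being built or overridden somewhere with auth=NOSASL, that mismatch would be worth ruling out first.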


Regards,
Ashish


-----Original Message-----
From: Aaron.Dossett [mailto:[email protected]] 
Sent: 21 March 2016 18:16
To: [email protected]
Subject: Re: Unable to execute Hive queries through oozie java action on kerberized cluster

Does that connection string work in other contexts outside of oozie, e.g.
beeline?  Does your java code run if executed directly instead of via Oozie?
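
Also worth checking: the 'hcat' credential in your workflow obtains a Hive *metastore* delegation token, while your code opens a HiveServer2 JDBC connection, which needs its own token. If your Oozie version supports it (the 'hive2' credential type was added in Oozie 4.1.0, and it has to be registered in oozie.credentials.credentialclasses in oozie-site.xml), you could try something along these lines. The property names are from the Oozie documentation; the name, URL, and principal values are placeholders based on your earlier mail, so treat this as a sketch:

```xml
<credential name='hs2_credentials' type='hive2'>
    <property>
        <name>hive2.jdbc.url</name>
        <value>jdbc:hive2://10.60.2.26:10000/default</value>
    </property>
    <property>
        <name>hive2.server.principal</name>
        <value>hive/[email protected]</value>
    </property>
</credential>
```

The action would then reference it via cred="hs2_credentials" instead of the hcat credential.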

On 3/21/16, 4:52 AM, "Ashish Gupta" <[email protected]> wrote:

>Hi All,
>
>I am trying to execute Hive queries through an Oozie Java action on a 
>secured (Kerberized) cluster; however, I am getting the exception below:
>
>Exception Stack trace:
>
>
>>>> Invoking Main class now >>>
>
>
>
>Launch time = 1458542514082
>
>Job launch time = 1458542514082
>
>mapreduce.job.tags = oozie-2e8d7ed9fc7551a353667830e09bef2b
>
>Main class        : com.citiustech.main.Test
>
>Arguments         :
>
>
>
>
>
><<< Invocation of Main class completed <<<
>
>
>
>Failing Oozie Launcher, Main class
>[org.apache.oozie.action.hadoop.JavaMain], main() threw exception,
>
>org.springframework.jdbc.CannotGetJdbcConnectionException: Could not get JDBC Connection; nested exception is java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://10.60.2.26:10000/default;principal=hive/[email protected];auth=NOSASL: GSS initiate failed
>
>org.apache.oozie.action.hadoop.JavaMainException: org.springframework.jdbc.CannotGetJdbcConnectionException: Could not get JDBC Connection; nested exception is java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://10.60.2.26:10000/default;principal=hive/[email protected];auth=NOSASL: GSS initiate failed
>
>  at org.apache.oozie.action.hadoop.JavaMain.run(JavaMain.java:58)
>  at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:39)
>  at org.apache.oozie.action.hadoop.JavaMain.main(JavaMain.java:36)
>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>  at java.lang.reflect.Method.invoke(Method.java:606)
>  at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:226)
>  at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
>  at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:450)
>  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
>  at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
>  at java.security.AccessController.doPrivileged(Native Method)
>  at javax.security.auth.Subject.doAs(Subject.java:415)
>  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
>  at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
>
>Caused by: org.springframework.jdbc.CannotGetJdbcConnectionException: Could not get JDBC Connection; nested exception is java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://10.60.2.26:10000/default;principal=hive/[email protected];auth=NOSASL: GSS initiate failed
>
>  at org.springframework.jdbc.datasource.DataSourceUtils.getConnection(DataSourceUtils.java:80)
>  at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:391)
>  at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:471)
>  at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:481)
>  at org.springframework.jdbc.core.JdbcTemplate.queryForObject(JdbcTemplate.java:491)
>  at org.springframework.jdbc.core.JdbcTemplate.queryForObject(JdbcTemplate.java:497)
>  at com.citiustech.main.Test.main(Test.java:26)
>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>  at java.lang.reflect.Method.invoke(Method.java:606)
>  at org.apache.oozie.action.hadoop.JavaMain.run(JavaMain.java:55)
>  ... 15 more
>
>Caused by: java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://10.60.2.26:10000/default;principal=hive/[email protected];auth=NOSASL: GSS initiate failed
>
>  at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:210)
>  at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:156)
>  at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
>  at java.sql.DriverManager.getConnection(DriverManager.java:571)
>  at java.sql.DriverManager.getConnection(DriverManager.java:187)
>  at org.springframework.jdbc.datasource.DriverManagerDataSource.getConnectionFromDriverManager(DriverManagerDataSource.java:153)
>  at org.springframework.jdbc.datasource.DriverManagerDataSource.getConnectionFromDriver(DriverManagerDataSource.java:144)
>  at org.springframework.jdbc.datasource.AbstractDriverBasedDataSource.getConnectionFromDriver(AbstractDriverBasedDataSource.java:155)
>  at org.springframework.jdbc.datasource.AbstractDriverBasedDataSource.getConnection(AbstractDriverBasedDataSource.java:120)
>  at org.springframework.jdbc.datasource.DataSourceUtils.doGetConnection(DataSourceUtils.java:111)
>  at org.springframework.jdbc.datasource.DataSourceUtils.getConnection(DataSourceUtils.java:77)
>  ... 26 more
>
>Caused by: org.apache.thrift.transport.TTransportException: GSS initiate failed
>
>  at org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:232)
>  at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:316)
>  at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
>  at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
>  at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
>  at java.security.AccessController.doPrivileged(Native Method)
>  at javax.security.auth.Subject.doAs(Subject.java:415)
>  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
>  at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
>  at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:185)
>  ... 36 more
>
>
>
>Oozie Launcher failed, finishing Hadoop job gracefully
>
>
>
>Oozie Launcher, uploading action data to HDFS sequence file: hdfs://example-qa1-nn:8020/user/admin/oozie-oozi/0000000-160317141545276-oozie-oozi-W/javaAction--java/action-data.seq
>
>
>
>Oozie Launcher ends
>
>
>Please find below the workflow XML specifying the Oozie Java action, and 
>the Java program that executes the Hive queries:
>
>Workflow.xml:
>
>
><workflow-app name="WorkFlowForJavaActionToExecuteHiveQuery" xmlns="uri:oozie:workflow:0.2.5">
>    <credentials>
>        <credential name='hive_credentials' type='hcat'>
>            <property>
>                <name>hcat.metastore.uri</name>
>                <value>thrift://example-qa1-dn2:9083</value>
>            </property>
>            <property>
>                <name>hcat.metastore.principal</name>
>                <value>hive/[email protected]</value>
>            </property>
>        </credential>
>    </credentials>
>
>    <start to="javaAction"/>
>
>    <action name="javaAction" cred="hive_credentials">
>        <java>
>            <job-tracker>${jobTracker}</job-tracker>
>            <name-node>${nameNode}</name-node>
>            <main-class>com.citiustech.main.Test</main-class>
>        </java>
>        <ok to="end"/>
>        <error to="fail"/>
>    </action>
>
>    <kill name="fail">
>        <message>Job failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
>    </kill>
>
>    <end name="end"/>
></workflow-app>
>
>
>Java Program:
>
>
>package com.citiustech.main;
>
>import java.io.IOException;
>
>import org.apache.hadoop.security.UserGroupInformation;
>import org.springframework.jdbc.core.JdbcTemplate;
>import org.springframework.jdbc.datasource.DriverManagerDataSource;
>
>public class Test {
>
>    public static void main(String[] args) throws IOException {
>        UserGroupInformation.loginUserFromKeytab("[email protected]",
>                "/etc/security/keytabs/admin.keytab");
>
>        org.apache.hadoop.conf.Configuration conf = new org.apache.hadoop.conf.Configuration();
>        conf.set("hadoop.security.authentication", "Kerberos");
>        UserGroupInformation.setConfiguration(conf);
>        UserGroupInformation.loginUserFromKeytab("[email protected]",
>                "/etc/security/keytabs/admin.keytab");
>
>        DriverManagerDataSource dataSource = new DriverManagerDataSource(
>                "jdbc:hive2://10.60.2.26:10001/default;principal=hive/[email protected]",
>                "hive", "");
>        dataSource.setDriverClassName("org.apache.hive.jdbc.HiveDriver");
>        JdbcTemplate jdbcTemplate = new JdbcTemplate();
>        jdbcTemplate.setDataSource(dataSource);
>
>        long count = jdbcTemplate.queryForObject("SELECT COUNT(*) FROM provider", Long.class);
>        System.out.println("Count is *************************" + count);
>    }
>}
>
>I have submitted the workflow job as the admin user, who has access to Hive.
>
>I also did kinit programmatically for the admin user, using the 
>UserGroupInformation API, but I am still getting the above exception. 
>I have kept only the spring-core, spring-jdbc, and hive-jdbc jars in the 
>Oozie HDFS cache.
>
>I have also tried the solutions suggested on Hortonworks, Stack Overflow, 
>and other forums, but I am still facing the same issue.
>
>Can anyone suggest a solution for this issue?
>
>
>Thanks & Regards,
>Ashish
