Hi Matt,

Thanks for the detailed response. Really appreciate it.

A few comments:

·         “if loading a DBCPConnectionPool for a HiveQL processor happens to 
work” – it does not allow picking a DBCPConnectionPool; only the Hive 
connection pool can be selected.

·         “The non-HiveQL processors make calls to the JDBC API that are not 
supported by the Hive JDBC driver” – Agreed. Even though the connection gets 
created, in PutSQL only INSERT and UPDATE commands work. As soon as 
stmt.addBatch() is called for Hive DDL, a SQLException is thrown.

·         Support for Kerberos is definitely a big factor. I don’t know how to 
deal with this if I use DBCP.

·         “Hive version mismatch between NiFi and HDI 3.4”. I am using NiFi 0.7 
(which uses hive-jdbc-2.0.0), and the Hive standalone JAR from the HDI cluster 
is hive-jdbc-1.2.1000.2.4.2.4-6-standalone. Not sure if this should be an issue.

Regarding a possible solution – I am thinking about doing exactly what you 
mentioned (as a quick fix): taking PutHiveQL and changing it to use 
DBCPConnectionPool. But before starting on that, I will definitely try to dig 
deeper into why the Hive connection pool is not working against HDI 3.4.

As of now, I don’t intend to query or load data in Hive from NiFi at all. The 
only thing I am trying to do is call certain DDL commands from NiFi. For 
example, I have modified the PutHDFS processor to return a flag if a new 
directory is created by an incoming flow file. If so, we just want to call 
ALTER TABLE ADD PARTITION to refresh the Hive metastore with the newly created 
partition.
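For reference, this is roughly the statement I have in mind. A minimal sketch 
only – the helper, table name, partition column, and location below are 
hypothetical examples, not from our actual flow:

```java
// Hypothetical helper that assembles the ALTER TABLE ... ADD PARTITION DDL
// for a newly created HDFS directory. All names here are examples only.
public class AddPartitionDdl {
    public static String build(String table, String partitionColumn,
                               String partitionValue, String location) {
        // IF NOT EXISTS makes the call idempotent if the flow retries
        return String.format(
            "ALTER TABLE %s ADD IF NOT EXISTS PARTITION (%s='%s') LOCATION '%s'",
            table, partitionColumn, partitionValue, location);
    }
}
```

The resulting string would then be handed to PutHiveQL as flow file content.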


Regards,
Manish

From: Matt Burgess [mailto:[email protected]]
Sent: Friday, September 30, 2016 7:13 PM
To: [email protected]
Subject: Re: PutHiveQL and Hive Connection Pool with HDInsight

Manish,

Sorry to hear you're having issues connecting. I should mention, though, that 
the fact that DBCPConnectionPool works with the Hive JDBC connection string 
doesn't imply that the processors that use DBCPConnectionPool will work for 
Hive. Notably, there are three differences:

1) The Hive JDBC driver has many JARs, so to use DBCPConnectionPool 
successfully (in NiFi 1.0.0 due to NIFI-2604 [1]) with a processor not in the 
Hive bundle, you'd need to add all the JARs for the Hive driver. This is not 
possible in NiFi 0.x; instead you'd need the fat/standalone JAR for the Hive 
driver. The Hive bundle includes the driver, so if loading a DBCPConnectionPool 
for a HiveQL processor happens to work, I suspect it's because all the Hive 
JARs are in the classpath of the classloader (from the Hive processor(s)) used 
to instantiate the connection.

2) The non-HiveQL processors make calls to the JDBC API that are not supported 
by the Hive JDBC driver. The HiveQL processors specifically avoid those methods 
that are not supported, but ExecuteSQL and PutSQL (for example) do not.

3) DBCPConnectionPool does not support Kerberos, and if there are settings for 
the Hive driver that are not supported on the URL, then they must be in a 
config file (hive-site.xml, e.g.) and DBCPConnectionPool doesn't support that 
either. Perhaps for your use case this is not an issue.
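To illustrate point 2 above – a sketch only, not the actual HiveQL processor 
code: DDL against the Hive driver has to go through plain Statement.execute(), 
one statement at a time, since batch calls like addBatch()/executeBatch() are 
among the JDBC methods the driver does not support:

```java
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.List;

// Sketch (not the real PutHiveQL code): run each HiveQL statement one at a
// time through Statement.execute(), avoiding the batch API
// (addBatch()/executeBatch()) that the Hive JDBC driver rejects for DDL.
public class HiveStatementRunner {
    public static void run(Connection conn, List<String> statements) throws SQLException {
        try (Statement stmt = conn.createStatement()) {
            for (String sql : statements) {
                stmt.execute(sql); // one round trip per statement; DDL-safe
            }
        }
    }
}
```

In the real processors the statements come from flow file content; here the 
helper just takes them as a list.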

I'd like to find the root of your problem if possible (maybe a Hive version 
mismatch between NiFi and HDI 3.4?), rather than have the need for a custom 
processor. Even so, you shouldn't need to code a full custom processor for 
this; instead you could copy PutHiveQL and replace the references to 
HiveDBCPService with DBCPService, and change the call to getConnectionURL() to 
whatever URL you want to be recorded for provenance (I'm not sure the 
connection string is available via java.sql.Connection, which is why the 
additional interface method was added).

If you do go down the custom processor path and get something working, I 
encourage you to share your findings with the community; in that case I'd 
imagine there are improvements that can be made to the existing processors so 
as to avoid the need for a custom one.

Regards,
Matt

[1] https://issues.apache.org/jira/browse/NIFI-2604

On Fri, Sep 30, 2016 at 7:59 AM, Manish Gupta 8 
<[email protected]> wrote:
I tried a couple more connection options, but always got an error while setting 
up the Hive Connection Pool. What’s strange is that the DBCP Connection Pool 
works fine with the same Hive JDBC connection string.

Now I am writing a custom “PutSQL”-like processor that uses the standard DBCP 
controller service and allows running DDL commands on Hive (since the standard 
PutSQL does not allow DDL statements – only INSERT and UPDATE work). 
Basically, I’ll be writing a custom PutHiveQL that can work with the standard 
DBCP service.

Regards,
Manish

From: Manish Gupta 8 [mailto:[email protected]]
Sent: Friday, September 30, 2016 3:09 AM

To: [email protected]
Subject: RE: PutHiveQL and Hive Connection Pool with HDInsight

I tried different combinations but couldn’t succeed. It’s an HDI 3.4 cluster 
with default Hive settings.

Some of the errors I received:


1. PutHiveQL[id=05505d0c-eee1-48bc-8a99-b53302118933] 
PutHiveQL[id=05505d0c-eee1-48bc-8a99-b53302118933] failed to process due to 
org.apache.nifi.processor.exception.ProcessException: 
org.apache.commons.dbcp.SQLNestedException: Cannot create 
PoolableConnectionFactory (Could not open client transport with JDBC Uri: 
jdbc:hive2://somehdiclustername.azurehdinsight.net:443/default;hive.server2.transport.mode=http;hive.server2.thrift.http.path=/: 
java.net.SocketException: Connection reset); rolling back session: 
org.apache.nifi.processor.exception.ProcessException: 
org.apache.commons.dbcp.SQLNestedException: Cannot create 
PoolableConnectionFactory (Could not open client transport with JDBC Uri: 
jdbc:hive2://somehdiclustername.azurehdinsight.net:443/default;hive.server2.transport.mode=http;hive.server2.thrift.http.path=/: 
java.net.SocketException: Connection reset).



2. failed to process session due to java.lang.NoSuchFieldError: INSTANCE: 
java.lang.NoSuchFieldError: INSTANCE



3. PutHiveQL[id=05505d0c-eee1-48bc-8a99-b53302118933] 
PutHiveQL[id=05505d0c-eee1-48bc-8a99-b53302118933] failed to process due to 
org.apache.nifi.processor.exception.ProcessException: 
org.apache.commons.dbcp.SQLNestedException: Cannot create 
PoolableConnectionFactory (Could not open client transport with JDBC Uri: 
jdbc:hive2://somehdiclustername.azurehdinsight.net:443/somedbname;ssl=true?transportMode=http;hive.server2.thrift.http.path=/: 
Invalid status 72); rolling back session: 
org.apache.nifi.processor.exception.ProcessException: 
org.apache.commons.dbcp.SQLNestedException: Cannot create 
PoolableConnectionFactory (Could not open client transport with JDBC Uri: 
jdbc:hive2://somehdiclustername.azurehdinsight.net:443/somedbname;ssl=true?transportMode=http;hive.server2.thrift.http.path=/: 
Invalid status 72)



4. PutHiveQL[id=05505d0c-eee1-48bc-8a99-b53302118933] 
PutHiveQL[id=05505d0c-eee1-48bc-8a99-b53302118933] failed to process due to 
org.apache.nifi.processor.exception.ProcessException: 
org.apache.commons.dbcp.SQLNestedException: Cannot create 
PoolableConnectionFactory (Could not open client transport with JDBC Uri: 
jdbc:hive2://somehdiclustername.azurehdinsight.net:443/somedbname;ssl=true?transportMode=http;httpPath=/: 
Invalid status 72); rolling back session: 
org.apache.nifi.processor.exception.ProcessException: 
org.apache.commons.dbcp.SQLNestedException: Cannot create 
PoolableConnectionFactory (Could not open client transport with JDBC Uri: 
jdbc:hive2://somehdiclustername.azurehdinsight.net:443/somedbname;ssl=true?transportMode=http;httpPath=/: 
Invalid status 72)



5. PutHiveQL[id=05505d0c-eee1-48bc-8a99-b53302118933] 
PutHiveQL[id=05505d0c-eee1-48bc-8a99-b53302118933] failed to process due to 
org.apache.nifi.processor.exception.ProcessException: 
org.apache.commons.dbcp.SQLNestedException: Cannot create 
PoolableConnectionFactory (Could not open client transport with JDBC Uri: 
jdbc:hive2://somehdiclustername.azurehdinsight.net:443/somedbname;ssl=true?transportMode=http: 
Invalid status 72); rolling back session: 
org.apache.nifi.processor.exception.ProcessException: 
org.apache.commons.dbcp.SQLNestedException: Cannot create 
PoolableConnectionFactory (Could not open client transport with JDBC Uri: 
jdbc:hive2://somehdiclustername.azurehdinsight.net:443/somedbname;ssl=true?transportMode=http: 
Invalid status 72)



Regards,
Manish

From: Manish Gupta 8 [mailto:[email protected]]
Sent: Friday, September 30, 2016 12:44 AM
To: [email protected]
Subject: RE: PutHiveQL and Hive Connection Pool with HDInsight

Thank you, Matt. I did try with hive.server2.transport.mode=http, like this: 
jdbc:hive2://somehdiclustername.azurehdinsight.net:443/somedbname;ssl=true?hive.server2.transport.mode=http;hive.server2.thrift.http.path=/hive2. 
But I was getting “java.lang.NoSuchFieldError: INSTANCE: 
java.lang.NoSuchFieldError: INSTANCE”.

I will try again with transportMode=http and/or httpPath=cliservice.

But as per Hive’s documentation, the right syntax should be 
hive.server2.transport.mode 
(https://cwiki.apache.org/confluence/display/Hive/Setting+Up+HiveServer2).

Regards,
Manish

From: Matt Burgess [mailto:[email protected]]
Sent: Thursday, September 29, 2016 7:28 PM
To: [email protected]
Subject: Re: PutHiveQL and Hive Connection Pool with HDInsight

Manish,

According to [1], status 72 means a bad URL; perhaps you need a transportMode 
and/or httpPath parameter in the URL (as described in the post)?
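As I understand the Hive JDBC URL format, session variables such as ssl, 
transportMode and httpPath go in the semicolon-separated section right after 
the database name (anything after a '?' is treated as Hive configuration 
instead). A minimal sketch with a hypothetical helper – the host and httpPath 
values below are placeholders from this thread, not verified HDI settings:

```java
// Sketch of the URL shape the newer Hive JDBC drivers expect for HTTP mode.
// Session variables (ssl, transportMode, httpPath) belong in the
// semicolon-separated section after the database name, not after a '?'.
// Host, database, and httpPath values are placeholders, not verified settings.
public class HiveHttpUrl {
    public static String build(String host, int port, String db, String httpPath) {
        return String.format(
            "jdbc:hive2://%s:%d/%s;ssl=true;transportMode=http;httpPath=%s",
            host, port, db, httpPath);
    }
}
```

Older drivers used the long-form names (hive.server2.transport.mode, 
hive.server2.thrift.http.path) in the same position, which is one reason a 
driver version mismatch can make a seemingly correct URL fail.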

Regards,
Matt

[1] 
https://community.hortonworks.com/questions/23864/hive-http-transport-mode-problem.html


On Thu, Sep 29, 2016 at 9:06 AM, Manish Gupta 8 
<[email protected]> wrote:
Hi,

I am not able to use PutHiveQL when accessing Hive on HDInsight. I am using 
NiFi 0.7.


•         I tried specifying the URL in a couple of different ways. If I follow 
the Azure documentation 
(https://azure.microsoft.com/en-in/documentation/articles/hdinsight-connect-hive-jdbc-driver/) 
and specify the URL as 
jdbc:hive2://somehdiclustername.azurehdinsight.net:443/somedbname;ssl=true?hive.server2.transport.mode=http;hive.server2.thrift.http.path=/hive2, 
then I get a “failed to process session due to java.lang.NoSuchFieldError: 
INSTANCE: java.lang.NoSuchFieldError: INSTANCE”.

•         I tried using the hive-jdbc JARs from my cluster (dropping them into 
lib), but then NiFi didn’t start (some javax.xml.parsers conflicts).

•         When I use 
“jdbc:hive2://somehdiclustername.azurehdinsight.net:443/somedbname”, I get the 
following error.

Is this issue because of https://issues.apache.org/jira/browse/NIFI-2575, or 
are my connection settings incorrect? Any workaround, or any reference 
settings/example for HDI? All I need to do is call an ALTER TABLE ADD PARTITION 
command in Hive from NiFi (once a day). Should I use HWI or a custom processor?

2016-09-29 08:18:48,194 INFO [StandardProcessScheduler Thread-1] 
o.a.n.c.s.TimerDrivenSchedulingAgent Scheduled 
PutHiveQL[id=05505d0c-eee1-48bc-8a99-b53302118933] to run with 1 threads
2016-09-29 08:18:48,194 INFO [Timer-Driven Process Thread-6] 
o.a.nifi.dbcp.hive.HiveConnectionPool 
HiveConnectionPool[id=4d7f766a-1177-4f1d-a376-6ba5b84bf856] Simple 
Authentication
2016-09-29 08:18:48,262 INFO [Timer-Driven Process Thread-6] 
org.apache.hive.jdbc.Utils Supplied authorities: 
somehdiclustername.azurehdinsight.net:443
2016-09-29 08:18:48,263 INFO [Timer-Driven Process Thread-6] 
org.apache.hive.jdbc.Utils Resolved authority: 
somehdiclustername.azurehdinsight.net:443
2016-09-29 08:18:48,468 INFO [Timer-Driven Process Thread-6] 
org.apache.hive.jdbc.HiveConnection Transport Used for JDBC connection: null
2016-09-29 08:18:48,468 ERROR [Timer-Driven Process Thread-6] 
o.a.nifi.dbcp.hive.HiveConnectionPool 
HiveConnectionPool[id=4d7f766a-1177-4f1d-a376-6ba5b84bf856] Error getting Hive 
connection
2016-09-29 08:18:48,484 ERROR [Timer-Driven Process Thread-6] 
o.a.nifi.dbcp.hive.HiveConnectionPool
org.apache.commons.dbcp.SQLNestedException: Cannot create 
PoolableConnectionFactory (Could not open client transport with JDBC Uri: 
jdbc:hive2://somehdiclustername.azurehdinsight.net:443/somedbname;ssl=true:
 Invalid status 72)
                at 
org.apache.commons.dbcp.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:1549)
 ~[commons-dbcp-1.4.jar:1.4]
                at 
org.apache.commons.dbcp.BasicDataSource.createDataSource(BasicDataSource.java:1388)
 ~[commons-dbcp-1.4.jar:1.4]
                at 
org.apache.commons.dbcp.BasicDataSource.getConnection(BasicDataSource.java:1044)
 ~[commons-dbcp-1.4.jar:1.4]
                at 
org.apache.nifi.dbcp.hive.HiveConnectionPool.getConnection(HiveConnectionPool.java:289)
 ~[nifi-hive-processors-0.7.0.jar:0.7.0]
                at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
~[na:1.8.0_102]
                at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
~[na:1.8.0_102]
                at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 ~[na:1.8.0_102]
                at java.lang.reflect.Method.invoke(Method.java:498) 
~[na:1.8.0_102]
                at 
org.apache.nifi.controller.service.StandardControllerServiceProvider$1.invoke(StandardControllerServiceProvider.java:166)
 [nifi-framework-core-0.7.0.jar:0.7.0]
                at com.sun.proxy.$Proxy89.getConnection(Unknown Source) [na:na]
                at 
org.apache.nifi.processors.hive.PutHiveQL.onTrigger(PutHiveQL.java:152) 
[nifi-hive-processors-0.7.0.jar:0.7.0]
                at 
org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
 [nifi-api-0.7.0.jar:0.7.0]
                at 
org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1054)
 [nifi-framework-core-0.7.0.jar:0.7.0]
                at 
org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:136)
 [nifi-framework-core-0.7.0.jar:0.7.0]
                at 
org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
 [nifi-framework-core-0.7.0.jar:0.7.0]
                at 
org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:127)
 [nifi-framework-core-0.7.0.jar:0.7.0]
                at 
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) 
[na:1.8.0_102]
                at 
java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) [na:1.8.0_102]
                at 
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
 [na:1.8.0_102]
                at 
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
 [na:1.8.0_102]
                at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) 
[na:1.8.0_102]
                at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) 
[na:1.8.0_102]
                at java.lang.Thread.run(Thread.java:745) [na:1.8.0_102]
Caused by: java.sql.SQLException: Could not open client transport with JDBC 
Uri: 
jdbc:hive2://somehdiclustername.azurehdinsight.net:443/somedbname;ssl=true:
 Invalid status 72
                at 
org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:207) 
~[hive-jdbc-2.0.0.jar:2.0.0]
                at 
org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:152) 
~[hive-jdbc-2.0.0.jar:2.0.0]
                at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107) 
~[hive-jdbc-2.0.0.jar:2.0.0]
                at 
org.apache.commons.dbcp.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:38)
 ~[commons-dbcp-1.4.jar:1.4]
                at 
org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582)
 ~[commons-dbcp-1.4.jar:1.4]
                at 
org.apache.commons.dbcp.BasicDataSource.validateConnectionFactory(BasicDataSource.java:1556)
 ~[commons-dbcp-1.4.jar:1.4]
                at 
org.apache.commons.dbcp.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:1545)
 ~[commons-dbcp-1.4.jar:1.4]
                ... 22 common frames omitted
Caused by: org.apache.thrift.transport.TTransportException: Invalid status 72
                at 
org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:232)
 ~[libthrift-0.9.3.jar:0.9.3]
                at 
org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:184)
 ~[libthrift-0.9.3.jar:0.9.3]
                at 
org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:307) 
~[libthrift-0.9.3.jar:0.9.3]
                at 
org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
 ~[libthrift-0.9.3.jar:0.9.3]
                at 
org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:181) 
~[hive-jdbc-2.0.0.jar:2.0.0]
                ... 28 common frames omitted
2016-09-29 08:18:48,484 ERROR [Timer-Driven Process Thread-6] 
o.apache.nifi.processors.hive.PutHiveQL 
PutHiveQL[id=05505d0c-eee1-48bc-8a99-b53302118933] 
PutHiveQL[id=05505d0c-eee1-48bc-8a99-b53302118933] failed to process due to 
org.apache.nifi.processor.exception.ProcessException: 
org.apache.commons.dbcp.SQLNestedException: Cannot create 
PoolableConnectionFactory (Could not open client transport with JDBC Uri: 
jdbc:hive2://somehdiclustername.azurehdinsight.net:443/somedbname;ssl=true:
 Invalid status 72); rolling back session: 
org.apache.nifi.processor.exception.ProcessException: 
org.apache.commons.dbcp.SQLNestedException: Cannot create 
PoolableConnectionFactory (Could not open client transport with JDBC Uri: 
jdbc:hive2://somehdiclustername.azurehdinsight.net:443/somedbname;ssl=true:
 Invalid status 72)
2016-09-29 08:18:48,499 ERROR [Timer-Driven Process Thread-6] 
o.apache.nifi.processors.hive.PutHiveQL
org.apache.nifi.processor.exception.ProcessException: 
org.apache.commons.dbcp.SQLNestedException: Cannot create 
PoolableConnectionFactory (Could not open client transport with JDBC Uri: 
jdbc:hive2://somehdiclustername.azurehdinsight.net:443/somedbname;ssl=true:
 Invalid status 72)
                at 
org.apache.nifi.dbcp.hive.HiveConnectionPool.getConnection(HiveConnectionPool.java:293)
 ~[na:na]
                at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
~[na:1.8.0_102]
                at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
~[na:1.8.0_102]
                at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 ~[na:1.8.0_102]
                at java.lang.reflect.Method.invoke(Method.java:498) 
~[na:1.8.0_102]
                at 
org.apache.nifi.controller.service.StandardControllerServiceProvider$1.invoke(StandardControllerServiceProvider.java:166)
 ~[nifi-framework-core-0.7.0.jar:0.7.0]
                at com.sun.proxy.$Proxy89.getConnection(Unknown Source) ~[na:na]
                at 
org.apache.nifi.processors.hive.PutHiveQL.onTrigger(PutHiveQL.java:152) ~[na:na]
                at 
org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
 ~[nifi-api-0.7.0.jar:0.7.0]
                at 
org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1054)
 [nifi-framework-core-0.7.0.jar:0.7.0]
                at 
org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:136)
 [nifi-framework-core-0.7.0.jar:0.7.0]
                at 
org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
 [nifi-framework-core-0.7.0.jar:0.7.0]
                at 
org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:127)
 [nifi-framework-core-0.7.0.jar:0.7.0]
                at 
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) 
[na:1.8.0_102]
                at 
java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) [na:1.8.0_102]
                at 
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
 [na:1.8.0_102]
                at 
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
 [na:1.8.0_102]
                at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) 
[na:1.8.0_102]
                at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) 
[na:1.8.0_102]
                at java.lang.Thread.run(Thread.java:745) [na:1.8.0_102]
Caused by: org.apache.commons.dbcp.SQLNestedException: Cannot create 
PoolableConnectionFactory (Could not open client transport with JDBC Uri: 
jdbc:hive2://somehdiclustername.azurehdinsight.net:443/somedbname;ssl=true:
 Invalid status 72)
                at 
org.apache.commons.dbcp.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:1549)
 ~[na:na]
                at 
org.apache.commons.dbcp.BasicDataSource.createDataSource(BasicDataSource.java:1388)
 ~[na:na]
                at 
org.apache.commons.dbcp.BasicDataSource.getConnection(BasicDataSource.java:1044)
 ~[na:na]
                at 
org.apache.nifi.dbcp.hive.HiveConnectionPool.getConnection(HiveConnectionPool.java:289)
 ~[na:na]
                ... 19 common frames omitted
Caused by: java.sql.SQLException: Could not open client transport with JDBC 
Uri: 
jdbc:hive2://somehdiclustername.azurehdinsight.net:443/somedbname;ssl=true:
 Invalid status 72
                at 
org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:207) 
~[na:na]
                at 
org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:152) ~[na:na]
                at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107) 
~[na:na]
                at 
org.apache.commons.dbcp.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:38)
 ~[na:na]
                at 
org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582)
 ~[na:na]
                at 
org.apache.commons.dbcp.BasicDataSource.validateConnectionFactory(BasicDataSource.java:1556)
 ~[na:na]
                at 
org.apache.commons.dbcp.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:1545)
 ~[na:na]
                ... 22 common frames omitted
Caused by: org.apache.thrift.transport.TTransportException: Invalid status 72
                at 
org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:232)
 ~[na:na]
                at 
org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:184)
 ~[na:na]
                at 
org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:307) 
~[na:na]
                at 
org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
 ~[na:na]
                at 
org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:181) 
~[na:na]
                ... 28 common frames omitted



Thanks,
Manish
