Thanks Arjun...

I will still wait for the fix as well.

Thanks
Asim

On Fri, Mar 16, 2018 at 12:33 PM, Arjun kr <arjun...@outlook.com> wrote:

> Hi Asim,
>
>
> You can refer to section 'Modifying and Submitting a Physical Plan to
> Drill' in https://drill.apache.org/docs/query-plans/.
>
>
>
> explain plan for select * from `hivejdbc`.`testdb`.`test`
>
> { "head" : { "version" : 1, "generator" : { "type" : "ExplainHandler",
> "info" : "" }, "type" : "APACHE_DRILL_PHYSICAL", "options" : [ ], "queue" :
> 0, "hasResourcePlan" : false, "resultMode" : "EXEC" }, "graph" : [ { "pop"
> : "jdbc-scan", "@id" : 2, "sql" : "SELECT *\nFROM.testdb.test", "config" :
> { "type" : "jdbc", "driver" : "org.apache.hive.jdbc.HiveDriver", "url" :
> "jdbc:hive2://localhost:10000", "username" : "arjun", "password" :
> "arjun", "enabled" : true }, "userName" : "", "cost" : 100.0 }, { "pop" :
> "project", "@id" : 1, "exprs" : [ { "ref" : "`id`", "expr" : "`id`" }, {
> "ref" : "`name`", "expr" : "`name`" } ], "child" : 2, "initialAllocation" :
> 1000000, "maxAllocation" : 10000000000, "cost" : 100.0 }, { "pop" :
> "screen", "@id" : 0, "child" : 1, "initialAllocation" : 1000000,
> "maxAllocation" : 10000000000, "cost" : 100.0 } ] }
>
> You can remove the dot (in the "sql" contents) and execute the plan as
> described in the above link for testing purposes.
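>
> For instance, after editing, the jdbc-scan entry's "sql" value above would
> read like this (a minimal sketch; only the stray dot after FROM is removed):
>
>     "sql" : "SELECT *\nFROM testdb.test"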
>
> The example picture in the link seems to be wrong. It shows text content
> being executed instead of JSON.
>
> Thanks,
>
> Arjun
>
> ________________________________
> From: Asim Kanungo <asim...@gmail.com>
> Sent: Friday, March 16, 2018 9:57 PM
> To: user@drill.apache.org
> Subject: Re: hive connection as generic jdbc
>
> Hi Kunal/Arjun,
>
> Yeah, I tried from a view as well, but no luck.
> I need to learn how to execute the physical plan; if there is an easy way,
> please let me know, else I can go through the documentation.
>
> Thanks to both of you for solving the connection issue, though.
> It will be great if you can update this same thread once you hear from the
> experts about the DOT issue described earlier.
>
> Thanks
> Asim
>
>
> On Thu, Mar 15, 2018 at 11:40 PM, Kunal Khatua <ku...@apache.org> wrote:
>
> > Not sure, but you could try exposing the table as a view, and then query
> > against that.
> >
> > On Thu, Mar 15, 2018 at 8:35 PM, Arjun kr <arjun...@outlook.com> wrote:
> >
> > >
> > > Yeah, I'm also getting the same error. The tables are getting listed
> > > though. The plan generates the table name as '.db.table'. If I remove
> > > this extra dot in the physical plan and execute it using the physical
> > > plan option, it runs successfully. I'll let the Drill experts comment
> > > on any possible solution for this.
> > >
> > > Thanks,
> > >
> > > Arjun
> > >
> > >
> > >
> > > ________________________________
> > > From: Asim Kanungo <asim...@gmail.com>
> > > Sent: Friday, March 16, 2018 4:51 AM
> > > To: user@drill.apache.org
> > > Subject: Re: hive connection as generic jdbc
> > >
> > > " as Thanks Arjun....
> > > I am able to use this and got a success message for adding the storage
> > > plugin.
> > >
> > > But while querying, I am getting a strange error. For example, if I do
> > > SELECT * FROM myhivetest.testschema.`testtable`; I am finding DRILL is
> > > submitting the query as:
> > >
> > > sql SELECT *
> > > FROM.testschema.testtable
> > > plugin BIGRED
> > > Fragment 0:0
> > >
> > > If you see, there is one extra DOT (.) before the schema name, and when
> > > the query goes from DRILL to my hive environment it fails because of
> > > the extra DOT.
> > >   (org.apache.hive.service.cli.HiveSQLException) Error while compiling
> > > statement: FAILED: ParseException line 2:4 cannot recognize input near
> > > '.' 'testschema' '.' in join source
> > >
> > > Also, when I am running count(*), DRILL is assigning "$f0" as the alias
> > > and failing.
> > >
> > > Error: DATA_READ ERROR: The JDBC storage plugin failed while trying
> > > setup the SQL query.
> > >
> > > sql SELECT COUNT(*) AS CNT
> > > FROM (SELECT 0 AS $f0
> > > FROM. testschema . testtable ) AS t
> > > plugin myhivetest
> > > Fragment 0:0
> > >
> > > [Error Id: d6a2fdf6-7979-4415-8d08-afbcd3667bde on rs-master.redstack.com:31010] (state=,code=0)
> > >
> > > Please try from your side, and let me know if you are facing the same
> > > issue.
> > >
> > >
> > > On Thu, Mar 15, 2018 at 1:49 AM, Arjun kr <arjun...@outlook.com> wrote:
> > >
> > > > Hi Asim,
> > > >
> > > > I was able to connect to Hive 1.2 using the jdbc storage plugin with
> > > > the two jars below. You may give it a try with these jars.
> > > >
> > > > http://central.maven.org/maven2/org/apache/hive/hive-service/1.1.1/hive-service-1.1.1.jar
> > > >
> > > > http://central.maven.org/maven2/org/apache/hive/hive-jdbc/1.1.1/hive-jdbc-1.1.1.jar
> > > >
> > > > Thanks,
> > > >
> > > > Arjun
> > > >
> > > >
> > > >
> > > > ________________________________
> > > > From: Arjun kr
> > > > Sent: Wednesday, March 14, 2018 1:05 PM
> > > > To: user@drill.apache.org
> > > > Subject: Re: hive connection as generic jdbc
> > > >
> > > >
> > > > Looks like hive-jdbc-1.1.1-standalone.jar has 'slf4j-log4j' bundled.
> > > > You may try cloning the repo below for the hive uber jar that you
> > > > tried earlier; it does not include the above jar. You can try removing
> > > > 'org.apache.httpcomponents:httpclient' and 'httpcore' from the
> > > > artifact include list and building a new jar.
> > > >
> > > >
> > > > https://github.com/timveil/hive-jdbc-uber-jar
> > > >
> > > >
> > > > https://github.com/timveil/hive-jdbc-uber-jar/blob/master/pom.xml#L168
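> > > >
> > > > Concretely, the lines to drop from that pom's maven-shade-plugin
> > > > artifact include list would be roughly these (a sketch only -- check
> > > > the linked pom for the exact entries):
> > > >
> > > > <include>org.apache.httpcomponents:httpclient</include>
> > > > <include>org.apache.httpcomponents:httpcore</include>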
> > > >
> > > >
> > > > Also, see if hive 0.14 jdbc jar works.
> > > >
> > > >
> > > > https://github.com/timveil/hive-jdbc-uber-jar/releases/download/v1.0-2.2.4.2/hive-jdbc-uber-2.2.4.2.jar
> > > >
> > > >
> > > > Thanks,
> > > >
> > > >
> > > > Arjun
> > > >
> > > >
> > > > ________________________________
> > > > From: Asim Kanungo <asim...@gmail.com>
> > > > Sent: Wednesday, March 14, 2018 9:06 AM
> > > > To: user@drill.apache.org
> > > > Subject: Re: hive connection as generic jdbc
> > > >
> > > > Hi Arjun,
> > > >
> > > > With that, I am getting an error while starting the drillbit.
> > > > SLF4J: Class path contains multiple SLF4J bindings.
> > > > SLF4J: Found binding in [jar:file:/opt/apache-drill-1.12.0/jars/3rdparty/hive-jdbc-1.1.1-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > > > SLF4J: Found binding in [jar:file:/opt/apache-drill-1.12.0/jars/classb/logback-classic-1.0.13.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > > > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> > > > explanation.
> > > > SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> > > > SLF4J: Detected both log4j-over-slf4j.jar AND slf4j-log4j12.jar on the class path, preempting StackOverflowError.
> > > > SLF4J: See also http://www.slf4j.org/codes.html#log4jDelegationLoop for more details.
> > > > Exception in thread "main" java.lang.ExceptionInInitializerError
> > > >         at org.apache.log4j.LogManager.getLogger(LogManager.java:44)
> > > >         at org.slf4j.impl.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:66)
> > > >         at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:277)
> > > >         at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:288)
> > > >         at org.apache.drill.exec.server.Drillbit.<clinit>(Drillbit.java:61)
> > > > Caused by: java.lang.IllegalStateException: Detected both log4j-over-slf4j.jar AND slf4j-log4j12.jar on the class path, preempting StackOverflowError. See also http://www.slf4j.org/codes.html#log4jDelegationLoop for more details.
> > > >         at org.apache.log4j.Log4jLoggerFactory.<clinit>(Log4jLoggerFactory.java:51)
> > > >         ... 5 more
> > > >
> > > > I have done some googling and found that this can be resolved by
> > > > adding an exclusion, but I am not sure which POM file has to be
> > > > edited. Please guide me if there is any other jar file I should use
> > > > and/or if any changes are required.
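> > > >
> > > > For what it's worth, such an exclusion usually looks like this in a
> > > > pom.xml (a sketch only -- the coordinates are assumptions based on
> > > > the jars discussed above):
> > > >
> > > > <dependency>
> > > >   <groupId>org.apache.hive</groupId>
> > > >   <artifactId>hive-jdbc</artifactId>
> > > >   <version>1.1.1</version>
> > > >   <exclusions>
> > > >     <exclusion>
> > > >       <groupId>org.slf4j</groupId>
> > > >       <artifactId>slf4j-log4j12</artifactId>
> > > >     </exclusion>
> > > >   </exclusions>
> > > > </dependency>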
> > > >
> > > > On Tue, Mar 13, 2018 at 1:22 PM, Arjun kr <arjun...@outlook.com> wrote:
> > > >
> > > > > Hi Asim,
> > > > >
> > > > >
> > > > > Can you try using the jar below? It looks like from hive 1.2
> > > > > onwards, Hive uses httpclient version 4.4, while previous versions
> > > > > of Hive use httpclient version 4.2.5. You may try with the hive
> > > > > 1.1.1 standalone jar to see if it helps.
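> > > > >
> > > > > As a quick check (assuming unzip is available), you can list which
> > > > > httpclient classes a driver jar actually bundles:
> > > > >
> > > > > unzip -l hive-jdbc-1.1.1-standalone.jar | grep 'org/apache/http'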
> > > > >
> > > > >
> > > > > http://central.maven.org/maven2/org/apache/hive/hive-jdbc/1.1.1/hive-jdbc-1.1.1-standalone.jar
> > > > >
> > > > >
> > > > > Thanks,
> > > > >
> > > > > Arjun
> > > > >
> > > > > ________________________________
> > > > > From: Asim Kanungo <asim...@gmail.com>
> > > > > Sent: Tuesday, March 13, 2018 11:47 AM
> > > > > To: user@drill.apache.org
> > > > > Subject: Re: hive connection as generic jdbc
> > > > >
> > > > > Hi Arjun,
> > > > >
> > > > > I have tried it, but no luck. I am still getting the INSTANCE error
> > > > > (Caused by: java.lang.NoSuchFieldError: INSTANCE).
> > > > > I am assuming it is happening because of some version mismatch. I am
> > > > > poor in Java, but found an article given in the link below.
> > > > > Can you please suggest if we can make any changes to the script; I
> > > > > can recompile the code after the change and deploy the jar file for
> > > > > the test.
> > > > >
> > > > > https://github.com/qubole/streamx/issues/32
> > > > >
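> > > > >
> > > > > One way to see which jars on the Drill classpath bundle the
> > > > > conflicting class from the stack trace (the install path below is
> > > > > the one mentioned earlier in this thread):
> > > > >
> > > > > for j in /opt/apache-drill-1.12.0/jars/3rdparty/*.jar; do
> > > > >   unzip -l "$j" | grep -q 'org/apache/http/conn/ssl/SSLConnectionSocketFactory' && echo "$j"
> > > > > done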
> > > > >
> > > > >
> > > > >
> > > > >
> > > > > On Tue, Mar 13, 2018 at 12:30 AM, Arjun kr <arjun...@outlook.com> wrote:
> > > > >
> > > > > > Hi Asim,
> > > > > >
> > > > > >
> > > > > > You may give it a shot by adding this uber jar to the Drill 3rd
> > > > > > party directory (remove the previously copied jars). For the
> > > > > > truststore, try giving an absolute path. The test was to validate
> > > > > > whether the hive uber jar works with your Hive setup.
> > > > > >
> > > > > >
> > > > > > Thanks,
> > > > > >
> > > > > >
> > > > > > Arjun
> > > > > >
> > > > > >
> > > > > > ________________________________
> > > > > > From: Asim Kanungo <asim...@gmail.com>
> > > > > > Sent: Tuesday, March 13, 2018 10:48 AM
> > > > > > To: user@drill.apache.org
> > > > > > Subject: Re: hive connection as generic jdbc
> > > > > >
> > > > > > Hi Arjun,
> > > > > >
> > > > > > I have tried with the hive jdbc uber jar and was able to make a
> > > > > > successful connection:
> > > > > >
> > > > > > java -cp "hive-jdbc-uber-2.6.3.0-235.jar:sqlline-1.1.9-drill-r7.jar:jline-2.10.jar" sqlline.SqlLine -d org.apache.hive.jdbc.HiveDriver -u "jdbc:hive2://<knox server name>:<port>/default;ssl=true;sslTrustStore=<location and filename of jks file>;trustStorePassword=********;transportMode=http;httpPath=<path>" -n <username> -p ******** -e "show tables;"
> > > > > >
> > > > > > As we have an SSL-enabled system, I had to give the extra details
> > > > > > in the URL, and it worked. Does that mean it should work when
> > > > > > added as a generic JDBC plugin? How is this test related to my
> > > > > > issue?
> > > > > >
> > > > > > Thanks
> > > > > > Asim
> > > > > >
> > > > > >
> > > > > >
> > > > > >
> > > > > >
> > > > > > On Mon, Mar 12, 2018 at 10:36 PM, Arjun kr <arjun...@outlook.com> wrote:
> > > > > >
> > > > > > > Hi Asim,
> > > > > > >
> > > > > > >
> > > > > > > You may try using the hive uber jar in case you have not tried
> > > > > > > it. See if the link below helps.
> > > > > > >
> > > > > > >
> > > > > > > https://github.com/timveil/hive-jdbc-uber-jar/releases
> > > > > >
> > > > > >
> > > > > >
> > > > > > >
> > > > > > >
> > > > > > > It would be ideal to test this uber jar with a sample JDBC
> > > > application
> > > > > > > before trying with Drill.
> > > > > > >
> > > > > > >
> > > > > > > java -cp "hive-jdbc-uber-2.6.3.0-235.jar:$DRILL_HOME/jars/3rdparty/sqlline-1.1.9-drill-r7.jar:$DRILL_HOME/jars/3rdparty/jline-2.10.jar" sqlline.SqlLine -d org.apache.hive.jdbc.HiveDriver -u "<JDBC URL>" -n <user> -p <password> -e "show tables;"
> > > > > > >
> > > > > > >
> > > > > > > Thanks,
> > > > > > >
> > > > > > >
> > > > > > > Arjun
> > > > > > >
> > > > > > >
> > > > > > >
> > > > > > > ________________________________
> > > > > > > From: Asim Kanungo <asim...@gmail.com>
> > > > > > > Sent: Tuesday, March 13, 2018 8:37 AM
> > > > > > > To: user@drill.apache.org
> > > > > > > Subject: Re: hive connection as generic jdbc
> > > > > > >
> > > > > > > Thanks Kunal...
> > > > > > >
> > > > > > > Here are the details.
> > > > > > > {
> > > > > > >   "type": "jdbc",
> > > > > > >   "driver": "org.apache.hive.jdbc.HiveDriver",
> > > > > > >   "url": "jdbc:hive2://knox
> > > > > > > address:port/default?ssl=true&transportMode=http&httpPath=
> > > > > > > pathdetail&sslTrustStore=mytruststore.jks&
> > > > trustStorePassword=******",
> > > > > > >   "username": "XXXXXXX",
> > > > > > >   "password": "**********",
> > > > > > >   "enabled": true
> > > > > > > }
> > > > > > >
> > > > > > > Please note that we have an SSL-enabled system, so I have used
> > > > > > > the truststore settings.
> > > > > > > Please let me know if you need any more details.
> > > > > > >
> > > > > > > Thanks
> > > > > > > Asim
> > > > > > >
> > > > > > >
> > > > > > > On Sun, Mar 11, 2018 at 11:55 PM, Kunal Khatua <ku...@apache.org> wrote:
> > > > > > >
> > > > > > > > I'm not sure how this is resolved since Hive is directly
> > > > > > > > accessed by Drill using the Hive storage plugin, instead of
> > > > > > > > via the JDBC storage plugin.
> > > > > > > > Perhaps you can share the parameters of the JDBC storage
> > > > > > > > plugin you used, so that folks more familiar with the JDBC
> > > > > > > > storage plugin can help.
> > > > > > > > I'll see what I can find out in the meanwhile.
> > > > > > > > ~ Kunal
> > > > > > > >
> > > > > > > > On Sat, Mar 10, 2018 at 7:23 PM, Asim Kanungo <asim...@gmail.com> wrote:
> > > > > > > >
> > > > > > > > > Hi Kunal,
> > > > > > > > >
> > > > > > > > > I have tried the steps and am getting the below error:
> > > > > > > > > 2018-03-08 22:39:59,234 [qtp433826182-75] ERROR o.a.d.e.server.rest.StorageResources - Unable to create/update plugin: test
> > > > > > > > > org.apache.drill.common.exceptions.ExecutionSetupException: Failure setting up new storage plugin configuration for config org.apache.drill.exec.store.jdbc.JdbcStorageConfig@8ef5d26f
> > > > > > > > >         at org.apache.drill.exec.store.StoragePluginRegistryImpl.create(StoragePluginRegistryImpl.java:355) ~[drill-java-exec-1.12.0.jar:1.12.0]
> > > > > > > > >         at org.apache.drill.exec.store.StoragePluginRegistryImpl.createOrUpdate(StoragePluginRegistryImpl.java:239) ~[drill-java-exec-1.12.0.jar:1.12.0]
> > > > > > > > >         at org.apache.drill.exec.server.rest.PluginConfigWrapper.createOrUpdateInStorage(PluginConfigWrapper.java:57) ~[drill-java-exec-1.12.0.jar:1.12.0]
> > > > > > > > >         at org.apache.drill.exec.server.rest.StorageResources.createOrUpdatePluginJSON(StorageResources.java:162) [drill-java-exec-1.12.0.jar:1.12.0]
> > > > > > > > >         at org.apache.drill.exec.server.rest.StorageResources.createOrUpdatePlugin(StorageResources.java:177) [drill-java-exec-1.12.0.jar:1.12.0]
> > > > > > > > >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_73]
> > > > > > > > >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_73]
> > > > > > > > >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_73]
> > > > > > > > >         at java.lang.reflect.Method.invoke(Method.java:497) ~[na:1.8.0_73]
> > > > > > > > >         at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory$1.invoke(ResourceMethodInvocationHandlerFactory.java:81) [jersey-server-2.8.jar:na]
> > > > > > > > >         at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:151) [jersey-server-2.8.jar:na]
> > > > > > > > >         at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:171) [jersey-server-2.8.jar:na]
> > > > > > > > >         at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$TypeOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:195) [jersey-server-2.8.jar:na]
> > > > > > > > >         at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:104) [jersey-server-2.8.jar:na]
> > > > > > > > >         at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:387) [jersey-server-2.8.jar:na]
> > > > > > > > >         at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:331) [jersey-server-2.8.jar:na]
> > > > > > > > >         at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:103) [jersey-server-2.8.jar:na]
> > > > > > > > >         at org.glassfish.jersey.server.ServerRuntime$1.run(ServerRuntime.java:269) [jersey-server-2.8.jar:na]
> > > > > > > > >         at org.glassfish.jersey.internal.Errors$1.call(Errors.java:271) [jersey-common-2.8.jar:na]
> > > > > > > > >         at org.glassfish.jersey.internal.Errors$1.call(Errors.java:267) [jersey-common-2.8.jar:na]
> > > > > > > > >         at org.glassfish.jersey.internal.Errors.process(Errors.java:315) [jersey-common-2.8.jar:na]
> > > > > > > > >         at org.glassfish.jersey.internal.Errors.process(Errors.java:297) [jersey-common-2.8.jar:na]
> > > > > > > > >         at org.glassfish.jersey.internal.Errors.process(Errors.java:267) [jersey-common-2.8.jar:na]
> > > > > > > > >         at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:297) [jersey-common-2.8.jar:na]
> > > > > > > > >         at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:252) [jersey-server-2.8.jar:na]
> > > > > > > > >         at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:1023) [jersey-server-2.8.jar:na]
> > > > > > > > >         at org.glassfish.jersey.servlet.WebComponent.service(WebComponent.java:372) [jersey-container-servlet-core-2.8.jar:na]
> > > > > > > > >         at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:382) [jersey-container-servlet-core-2.8.jar:na]
> > > > > > > > >         at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:345) [jersey-container-servlet-core-2.8.jar:na]
> > > > > > > > >         at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:220) [jersey-container-servlet-core-2.8.jar:na]
> > > > > > > > >         at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:738) [jetty-servlet-9.1.5.v20140505.jar:9.1.5.v20140505]
> > > > > > > > >         at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:551) [jetty-servlet-9.1.5.v20140505.jar:9.1.5.v20140505]
> > > > > > > > >         at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:219) [jetty-server-9.1.5.v20140505.jar:9.1.5.v20140505]
> > > > > > > > >         at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1111) [jetty-server-9.1.5.v20140505.jar:9.1.5.v20140505]
> > > > > > > > >         at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:478) [jetty-servlet-9.1.5.v20140505.jar:9.1.5.v20140505]
> > > > > > > > >         at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:183) [jetty-server-9.1.5.v20140505.jar:9.1.5.v20140505]
> > > > > > > > >         at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1045) [jetty-server-9.1.5.v20140505.jar:9.1.5.v20140505]
> > > > > > > > >         at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) [jetty-server-9.1.5.v20140505.jar:9.1.5.v20140505]
> > > > > > > > >         at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97) [jetty-server-9.1.5.v20140505.jar:9.1.5.v20140505]
> > > > > > > > >         at org.eclipse.jetty.server.Server.handle(Server.java:462) [jetty-server-9.1.5.v20140505.jar:9.1.5.v20140505]
> > > > > > > > >         at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:279) [jetty-server-9.1.5.v20140505.jar:9.1.5.v20140505]
> > > > > > > > >         at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:232) [jetty-server-9.1.5.v20140505.jar:9.1.5.v20140505]
> > > > > > > > >         at org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:534) [jetty-io-9.1.5.v20140505.jar:9.1.5.v20140505]
> > > > > > > > >         at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:607) [jetty-util-9.1.5.v20140505.jar:9.1.5.v20140505]
> > > > > > > > >         at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:536) [jetty-util-9.1.5.v20140505.jar:9.1.5.v20140505]
> > > > > > > > >         at java.lang.Thread.run(Thread.java:745) [na:1.8.0_73]
> > > > > > > > > *Caused by: java.lang.NoSuchFieldError: INSTANCE*
> > > > > > > > >         at org.apache.http.conn.ssl.SSLConnectionSocketFactory.<clinit>(SSLConnectionSocketFactory.java:144) ~[hive-jdbc.jar:1.2.1000.2.5.3.0-37]
> > > > > > > > >         at org.apache.hive.jdbc.HiveConnection.getHttpClient(HiveConnection.java:365) ~[hive-jdbc.jar:1.2.1000.2.5.3.0-37]
> > > > > > > > >         at org.apache.hive.jdbc.HiveConnection.createHttpTransport(HiveConnection.java:244) ~[hive-jdbc.jar:1.2.1000.2.5.3.0-37]
> > > > > > > > >         at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:191) ~[hive-jdbc.jar:1.2.1000.2.5.3.0-37]
> > > > > > > > >         at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:155) ~[hive-jdbc.jar:1.2.1000.2.5.3.0-37]
> > > > > > > > >         at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105) ~[hive-jdbc.jar:1.2.1000.2.5.3.0-37]
> > > > > > > > >         at org.apache.commons.dbcp.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:38) ~[commons-dbcp-1.4.jar:1.4]
> > > > > > > > >         at org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582) ~[commons-dbcp-1.4.jar:1.4]
> > > > > > > > >         at org.apache.commons.dbcp.BasicDataSource.validateConnectionFactory(BasicDataSource.java:1556) ~[commons-dbcp-1.4.jar:1.4]
> > > > > > > > >         at org.apache.commons.dbcp.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:1545) ~[commons-dbcp-1.4.jar:1.4]
> > > > > > > > >         at org.apache.commons.dbcp.BasicDataSource.createDataSource(BasicDataSource.java:1388) ~[commons-dbcp-1.4.jar:1.4]
> > > > > > > > >         at org.apache.commons.dbcp.BasicDataSource.getConnection(BasicDataSource.java:1044) ~[commons-dbcp-1.4.jar:1.4]
> > > > > > > > >         at org.apache.calcite.adapter.jdbc.JdbcUtils$DialectPool.get(JdbcUtils.java:73) ~[calcite-core-1.4.0-drill-r23.jar:1.4.0-drill-r23]
> > > > > > > > >         at org.apache.calcite.adapter.jdbc.JdbcSchema.createDialect(JdbcSchema.java:138) ~[calcite-core-1.4.0-drill-r23.jar:1.4.0-drill-r23]
> > > > > > > > >         at org.apache.drill.exec.store.jdbc.JdbcStoragePlugin.<init>(JdbcStoragePlugin.java:103) ~[drill-jdbc-storage-1.12.0.jar:1.12.0]
> > > > > > > > >         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:1.8.0_73]
> > > > > > > > >         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[na:1.8.0_73]
> > > > > > > > >         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:1.8.0_73]
> > > > > > > > >         at java.lang.reflect.Constructor.newInstance(Constructor.java:422) ~[na:1.8.0_73]
> > > > > > > > >         at org.apache.drill.exec.store.StoragePluginRegistryImpl.create(StoragePluginRegistryImpl.java:346) ~[drill-java-exec-1.12.0.jar:1.12.0]
> > > > > > > > >         ... 45 common frames omitted
> > > > > > > > >
> > > > > > > > > The JAR files for Hive and HTTP are added, and
> > > > > > > > > "drill.exec.sys.store.provider.local.path" is set in the
> > > > > > > > > drill-override.conf file. After searching on google I found
> > > > > > > > > that this issue occurs when there is a version mismatch or
> > > > > > > > > when one class is present in two or more JAR files. I don't
> > > > > > > > > have much experience in Java, so can you please let me know
> > > > > > > > > of any particular JAR which has to be removed or added to
> > > > > > > > > resolve this issue.
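> > > > > > > > >
> > > > > > > > > For reference, that override entry has roughly this shape in
> > > > > > > > > drill-override.conf (the path here is just a placeholder):
> > > > > > > > >
> > > > > > > > > drill.exec: {
> > > > > > > > >   sys.store.provider.local.path: "/path/to/plugin/storage"
> > > > > > > > > }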
> > > > > > > > > Thanks again for your time and help !!!!
> > > > > > > > >
> > > > > > > > > Thanks
> > > > > > > > > Asim
> > > > > > > > >
> > > > > > > > > On Wed, Mar 7, 2018 at 10:24 PM, Asim Kanungo <asim...@gmail.com> wrote:
> > > > > > > > >
> > > > > > > > > > Thanks Kunal for clarifying...
> > > > > > > > > > I am still learning, so for my first project I am trying
> > > > > > > > > > to create a successful connection.
> > > > > > > > > > I will work on the optimization in the second phase.
> > > > > > > > > > Thanks for your valuable tips; let me try to create the
> > > > > > > > > > hive connection through JDBC then.
> > > > > > > > > > I suppose I need to put the hive jdbc drivers in the 3rd
> > > > > > > > > > party directory; please let me know if you have the list
> > > > > > > > > > of the drivers or jars I need to put there.
> > > > > > > > > >
> > > > > > > > > > Thanks
> > > > > > > > > > Asim
> > > > > > > > > >
> > > > > > > > > > On Wed, Mar 7, 2018 at 6:06 PM, Kunal Khatua <kunalkha...@gmail.com> wrote:
> > > > > > > > > >
> > > > > > > > > >> You should be able to connect to a Hive cluster via JDBC.
> > > > > > > > > >> However, the benefit of using Drill co-located on the same
> > > > > > > > > >> cluster is that Drill can directly access the data based
> > > > > > > > > >> on locality information from Hive and process across the
> > > > > > > > > >> distributed FS cluster.
> > > > > > > > > >>
> > > > > > > > > >> With JDBC, any filters you have will (most probably) not
> > > > > > > > > >> be pushed down to Hive. So you'll end up loading the
> > > > > > > > > >> unfiltered data through a single data channel from the
> > > > > > > > > >> Hive cluster into your Drill cluster before it can start
> > > > > > > > > >> processing.
> > > > > > > > > >>
> > > > > > > > > >> If using JDBC is the only option, it might be worth using
> > > > > > > > > >> the 'create table as' (or the temporary table variant) to
> > > > > > > > > >> offload that data into your Drill cluster and then execute
> > > > > > > > > >> your analytical queries against this offloaded dataset,
> > > > > > > > > >> e.g. as sketched below.
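> > > > > > > > > >>
> > > > > > > > > >> A minimal sketch of that offload (the table name is a
> > > > > > > > > >> placeholder):
> > > > > > > > > >>
> > > > > > > > > >> CREATE TEMPORARY TABLE offloaded_test AS
> > > > > > > > > >> SELECT * FROM myhivetest.testschema.`testtable`;
> > > > > > > > > >>
> > > > > > > > > >> SELECT COUNT(*) FROM offloaded_test;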
> > > > > > > > > >>
> > > > > > > > > >> On 3/7/2018 2:46:55 PM, Asim Kanungo <asim...@gmail.com> wrote:
> > > > > > > > > >> Hi Team,
> > > > > > > > > >>
> > > > > > > > > >> Can I connect to a hive database as a generic jdbc
> > > > > > > > > >> protocol, like I am doing for other RDBMSs? Or can DRILL
> > > > > > > > > >> only connect to hive residing in the same cluster where
> > > > > > > > > >> Hadoop is installed?
> > > > > > > > > >>
> > > > > > > > > >> I am talking about the cases where DRILL is installed in
> > > > > > > > > >> one cluster and my Hadoop cluster is different. Can you
> > > > > > > > > >> please guide me on whether I can connect to that hive
> > > > > > > > > >> using the jdbc protocol?
> > > > > > > > > >> I have tried, but it failed with the error "unable to
> > > > > > > > > >> add/update storage". I am not sure why it failed, as it
> > > > > > > > > >> is not giving any other messages, and I know version 1.13
> > > > > > > > > >> will contain more detailed messages.
> > > > > > > > > >>
> > > > > > > > > >> Please advise.
> > > > > > > > > >>
> > > > > > > > > >> Thanks
> > > > > > > > > >> Asim
> > > > > > > > > >>
> > > > > > > > > >
> > > > > > > > > >
> > > > > > > > >
> > > > > > > >
> > > > > > >
> > > > > >
> > > > >
> > > >
> > >
> >
>
