[ https://issues.apache.org/jira/browse/SPARK-9686?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15010244#comment-15010244 ]

Navis edited comment on SPARK-9686 at 11/18/15 5:06 AM:
--------------------------------------------------------

[~lian cheng]  Sorry, I confused "remote metastore" with "remote database".
I'm using a local metastore, without the hive.metastore.uris setting.

And,
bq. We should override corresponding methods in SparkSQLCLIService and dispatch 
these JDBC calls to the metastore Hive client.
The attached patch does exactly that, together with a configuration replacement 
that asserts a valid metastore configuration (without it, Hive.get() destroys 
the existing connection and creates a new one backed by a dummy Derby metastore).
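
For illustration only, here is a minimal sketch of what answering such a JDBC 
metadata call from the metastore Hive client could look like, assuming a plain 
HiveMetaStoreClient built from the session's HiveConf (the class and method names 
below are hypothetical and not taken from the attached patch):

{code:java}
import java.util.List;

import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.metastore.HiveMetaStoreClient;
import org.apache.hadoop.hive.metastore.api.MetaException;

// Illustrative helper (not the actual patch): answer a JDBC GetTables-style
// request from a metastore client built with the session's HiveConf, so the
// lookup never falls back to a freshly created dummy Derby metastore.
public class MetastoreTableLookup {

    private final HiveMetaStoreClient client;

    // conf is assumed to carry the session's real metastore settings
    // (e.g. the JDO connection URL when a local metastore is used).
    public MetastoreTableLookup(HiveConf conf) throws MetaException {
        this.client = new HiveMetaStoreClient(conf);
    }

    // List table names in dbName matching tablePattern; a GetTables
    // operation would call this once per matching schema.
    public List<String> listTables(String dbName, String tablePattern) throws Exception {
        return client.getTables(dbName, tablePattern);
    }
}
{code}

The patch itself overrides the corresponding methods in SparkSQLCLIService as 
described above; the sketch only shows the metastore call being served by a 
client created from the correct configuration.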



> Spark Thrift server doesn't return correct JDBC metadata 
> ---------------------------------------------------------
>
>                 Key: SPARK-9686
>                 URL: https://issues.apache.org/jira/browse/SPARK-9686
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.4.0, 1.4.1, 1.5.0, 1.5.1, 1.5.2
>            Reporter: pin_zhang
>            Assignee: Cheng Lian
>         Attachments: SPARK-9686.1.patch.txt
>
>
> 1. Start start-thriftserver.sh
> 2. Connect with beeline
> 3. Create a table
> 4. Run SHOW TABLES; the newly created table is returned
> 5. Run the following JDBC client code:
> Class.forName("org.apache.hive.jdbc.HiveDriver");
> String url = "jdbc:hive2://localhost:10000/default";
> Properties info = new Properties();
> Connection conn = DriverManager.getConnection(url, info);
> ResultSet tables = conn.getMetaData().getTables(conn.getCatalog(), null, null, null);
> Problem:
> No tables are returned by this API, although it works in Spark 1.3.


