[ https://issues.apache.org/jira/browse/SPARK-9686?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16679796#comment-16679796 ]

Yuming Wang commented on SPARK-9686:
------------------------------------

My fix is to implement Spark's own Thrift server metadata operations:
Implement Spark's own GetSchemasOperation: [https://github.com/apache/spark/pull/22903]
Implement Spark's own GetTablesOperation: [https://github.com/apache/spark/pull/22794]
Implement Spark's own GetColumnsOperation: [https://github.com/wangyum/spark/blob/SPARK-24570-DBVisualizer/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkGetColumnsOperation.scala]
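For context, the standard JDBC metadata calls map onto these server-side operations: DatabaseMetaData.getSchemas() is answered by GetSchemasOperation, getTables() by GetTablesOperation, and getColumns() by GetColumnsOperation. Below is a minimal client sketch that can be used to verify the behaviour before and after the change; the class name and the localhost:10000 endpoint are only illustrative assumptions taken from the reproduction steps in this issue, not part of the fix itself.

    import java.sql.Connection;
    import java.sql.DatabaseMetaData;
    import java.sql.DriverManager;
    import java.sql.ResultSet;

    public class ThriftServerMetadataCheck {
        public static void main(String[] args) throws Exception {
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            // Endpoint taken from the reproduction steps below; adjust as needed.
            try (Connection conn =
                     DriverManager.getConnection("jdbc:hive2://localhost:10000/default")) {
                DatabaseMetaData md = conn.getMetaData();

                // Answered by GetSchemasOperation on the server side.
                try (ResultSet rs = md.getSchemas()) {
                    while (rs.next()) System.out.println("schema: " + rs.getString("TABLE_SCHEM"));
                }
                // Answered by GetTablesOperation.
                try (ResultSet rs = md.getTables(null, null, "%", new String[]{"TABLE", "VIEW"})) {
                    while (rs.next()) System.out.println("table:  " + rs.getString("TABLE_NAME"));
                }
                // Answered by GetColumnsOperation.
                try (ResultSet rs = md.getColumns(null, null, "%", "%")) {
                    while (rs.next()) System.out.println("column: " + rs.getString("COLUMN_NAME"));
                }
            }
        }
    }

With the Hive-provided operations these calls do not reflect tables created through the Thrift server (the symptom reported below); with Spark's own operations they should list the schemas, tables, and columns known to Spark's session catalog.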
 

> Spark Thrift server doesn't return correct JDBC metadata 
> ---------------------------------------------------------
>
>                 Key: SPARK-9686
>                 URL: https://issues.apache.org/jira/browse/SPARK-9686
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.4.0, 1.4.1, 1.5.0, 1.5.1, 1.5.2
>            Reporter: pin_zhang
>            Priority: Critical
>         Attachments: SPARK-9686.1.patch.txt
>
>
> 1. Start start-thriftserver.sh
> 2. Connect with beeline
> 3. Create a table
> 4. Run SHOW TABLES; the newly created table is returned
> 5. Query the table metadata through JDBC
>    (imports needed: java.sql.Connection, java.sql.DriverManager, java.sql.ResultSet, java.util.Properties):
>       Class.forName("org.apache.hive.jdbc.HiveDriver");
>       String url = "jdbc:hive2://localhost:10000/default";
>       Properties info = new Properties();
>       Connection conn = DriverManager.getConnection(url, info);
>       ResultSet tables = conn.getMetaData().getTables(conn.getCatalog(), null, null, null);
> Problem:
>       No tables are returned by this API, although it worked in Spark 1.3.


