Hello, there seems to be missing support for some operations in the Spark SQL Thrift server. To be more specific: when connected to our Spark SQL instance (1.5.1, standalone deployment) from a standard JDBC SQL client (SQuirreL SQL and a few others) via the Thrift server, SQL query processing works fine except for one thing - the SQL client is unable to fetch the list of tables (objects) present in the Spark SQL instance. Tables can be used in SQL queries, which work perfectly, but they are not visible in the list of objects, which shows a single database/tablespace (default) containing only empty sub-folders (INDEX_TABLE, TABLE, VIEW, UDT). As a result, the advanced features of the SQL client (such as code completion and syntax highlighting) do not work as expected, and writing SQL queries is not as comfortable as with an ordinary DBMS.
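For illustration, this is roughly what the client does under the hood - a minimal repro in Scala (the connection URL, user, and password are placeholders; adjust for your own deployment):

import java.sql.DriverManager

object ThriftMetadataCheck {
  def main(args: Array[String]): Unit = {
    Class.forName("org.apache.hive.jdbc.HiveDriver")
    // Placeholder URL/credentials - adjust for your deployment.
    val conn = DriverManager.getConnection(
      "jdbc:hive2://localhost:10000/default", "user", "")
    try {
      // Standard JDBC metadata call that SQL clients use to build the
      // object tree; against 1.5.1 this loop prints nothing.
      val tables = conn.getMetaData.getTables(
        null, "default", "%", Array("TABLE", "VIEW"))
      while (tables.next()) println("metadata: " + tables.getString("TABLE_NAME"))
      tables.close()

      // The very same tables are visible through an execute-statement
      // operation, which the Thrift server does implement.
      val stmt = conn.createStatement()
      val rs = stmt.executeQuery("SHOW TABLES")
      while (rs.next()) println("show tables: " + rs.getString(1))
      rs.close()
      stmt.close()
    } finally {
      conn.close()
    }
  }
}

The first loop comes back empty, while SHOW TABLES lists every registered table - which is what led me to the Thrift server internals below.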
After a short debugging session, I found that the problem seems to lie in the implementation of org.apache.spark.sql.hive.thriftserver.server.SparkSQLOperationManager, which overrides only a single kind of operation - execute statement - leaving the remaining operations to their default Hive implementations (which, in our case, return an empty database). SQL clients use standard JDBC API calls to query database tables and their metadata; those calls end up in org.apache.hive.service.cli.operation.OperationManager (everything other than execute statement - get catalogs, get tables, ...), and since there is no Spark SQL-specific implementation for them, tables registered within Spark SQL are not visible to those calls.

Is this known/desired behavior, or is there a plan to add implementations of the missing JDBC operations to SparkSQLOperationManager to provide full-featured JDBC functionality?

Thank you for your answer (and for the great work you are doing there).

Best regards,
Rastislav Krist
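P.S. To make the idea concrete, here is a very rough, untested sketch of what a Spark-aware "get tables" operation might look like. It is only an illustration of the shape, not working code: I am assuming that Hive's GetTablesOperation can be subclassed with its rowSet field accessible to subclasses and its runInternal() overridden (the exact visibility of those members may differ between Hive versions), and I am using the public HiveContext.tableNames() purely as an example of a Spark-side catalog lookup:

import java.util.{List => JList}
import org.apache.hive.service.cli.OperationState
import org.apache.hive.service.cli.operation.GetTablesOperation
import org.apache.hive.service.cli.session.HiveSession
import org.apache.spark.sql.hive.HiveContext

// Untested sketch: assumes rowSet is accessible to subclasses and that
// runInternal() is the right hook; both may differ across Hive versions.
class SparkGetTablesOperation(
    hiveContext: HiveContext,
    parentSession: HiveSession,
    catalogName: String,
    schemaName: String,
    tableName: String,
    tableTypes: JList[String])
  extends GetTablesOperation(
    parentSession, catalogName, schemaName, tableName, tableTypes) {

  override def runInternal(): Unit = {
    setState(OperationState.RUNNING)
    // Answer the JDBC "get tables" call from Spark's own catalog instead
    // of falling through to the (empty) default Hive view.
    hiveContext.tableNames().foreach { name =>
      rowSet.addRow(Array[AnyRef](catalogName, schemaName, name, "TABLE", ""))
    }
    setState(OperationState.FINISHED)
  }
}

SparkSQLOperationManager would then also need to override newGetTablesOperation to hand out this operation instead of the stock Hive one, and similarly for get catalogs, get schemas, and the rest.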