Hi Mohammed, I think this is something you can do at Thrift server startup. This would run an instance of Derby to act as the Metastore. Any idea whether this Derby Metastore will support distributed access, and why we use the Hive Metastore then?
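For reference, a minimal sketch of that fallback behaviour (assumptions: a local Spark installation at `$SPARK_HOME`, and the default Thrift port 10000). When no hive-site.xml is on the classpath, Spark creates an embedded Derby metastore as `metastore_db/` in the working directory; embedded Derby allows only one process to open the database at a time, so it is not suitable for distributed or concurrent metastore access:

```shell
# Start the Thrift server with no hive-site.xml configured; Spark falls
# back to an embedded Derby metastore (metastore_db/ in the current
# working directory, single-process access only).
$SPARK_HOME/sbin/start-thriftserver.sh \
  --master local[2] \
  --hiveconf hive.server2.thrift.port=10000

# Connect over JDBC with Beeline (10000 is the default port).
$SPARK_HOME/bin/beeline -u jdbc:hive2://localhost:10000 -e "SHOW TABLES;"
```

For shared access from multiple Thrift server instances you would instead point the metastore at a networked database (e.g. MySQL or PostgreSQL) via hive-site.xml, which is the usual reason the Hive Metastore service is used.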
@Angela: I would also be happy to have a metastore owned by the Spark Thrift Server. What are you trying to achieve by using the Thrift server without Hive?

Regards,
Sambit.

-----Original Message-----
From: Mohammed Guller [mailto:moham...@glassbeam.com]
Sent: Wednesday, January 13, 2016 2:54 PM
To: angela.whelan <angela.whe...@synchronoss.com>; user@spark.apache.org
Subject: RE: Is it possible to use SparkSQL JDBC ThriftServer without Hive

Hi Angela,

Yes, you can use Spark SQL JDBC/ThriftServer without Hive.

Mohammed

-----Original Message-----
From: angela.whelan [mailto:angela.whe...@synchronoss.com]
Sent: Wednesday, January 13, 2016 3:37 AM
To: user@spark.apache.org
Subject: Is it possible to use SparkSQL JDBC ThriftServer without Hive

Hi, I'm wondering if it is possible to use the SparkSQL JDBC ThriftServer without Hive? The reason I'm asking is that we are unsure about the speed of Hive with SparkSQL JDBC connectivity. I can't find any article online about using the SparkSQL JDBC ThriftServer without Hive. Many thanks in advance for any help on this.

Thanks,
Angela

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Is-it-possible-to-use-SparkSQL-JDBC-ThriftServer-without-Hive-tp25959.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org