Is it possible to use our own metastore instead of Hive Metastore with Spark SQL?
Can you please point me to some docs or code I can look at to get it done? We are moving away from everything Hadoop.
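For context, the closest thing I have found so far is Spark 3's DataSource V2 catalog plugin interface (`org.apache.spark.sql.connector.catalog.CatalogPlugin` / `TableCatalog`), which seems to let you register a custom catalog implementation through configuration, roughly like this (the class name and the `uri` option below are hypothetical placeholders, not a real implementation):

```properties
# spark-defaults.conf sketch -- com.example.MyMetastoreCatalog would be a
# custom class implementing org.apache.spark.sql.connector.catalog.TableCatalog
spark.sql.catalog.my_catalog       com.example.MyMetastoreCatalog
# catalog-specific options are passed through to the plugin's initialize();
# the key name here is whatever the custom implementation chooses to read
spark.sql.catalog.my_catalog.uri   thrift://metastore-host:9083
# make it the default catalog for unqualified table names (Spark 3.0+)
spark.sql.defaultCatalog           my_catalog
```

Is this the right direction, or is there a better-supported way to replace the Hive Metastore entirely?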