This should work. Which version of Spark are you using? Here is what I do
-- make sure hive-site.xml is in the conf/ directory on the machine you're
running the driver from, then run spark-shell from that machine:

scala> val hc = new org.apache.spark.sql.hive.HiveContext(sc)
hc: org.apache.spark.sql.hive.HiveContext =
org.apache.spark.sql.hive.HiveContext@6e9f8f26

scala> hc.sql("show tables").collect
15/05/15 09:34:17 INFO metastore: Trying to connect to metastore with
URI thrift://hostname.com:9083              <-- here should be a value
from your hive-site.xml
15/05/15 09:34:17 INFO metastore: Waiting 1 seconds before next
connection attempt.
15/05/15 09:34:18 INFO metastore: Connected to metastore.
res0: Array[org.apache.spark.sql.Row] = Array([table1,false],

scala> hc.getConf("hive.metastore.uris")
res13: String = thrift://hostname.com:9083

scala> hc.getConf("hive.metastore.warehouse.dir")
res14: String = /user/hive/warehouse


The first log line tells you which metastore it's trying to connect to --
this should be the URI specified under the hive.metastore.uris property in
your hive-site.xml file. I have not experimented much with
hive.metastore.warehouse.dir, but I know the metastore URI is in fact
picked up from hive-site.xml, as I regularly point it at different
systems...
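For reference, a minimal hive-site.xml along these lines might look like the sketch below. The hostname, port, and bucket/path values are placeholders, not values from a real deployment; the s3a:// scheme is one assumption for how the warehouse could live on external storage, per the original question (wasb:// would be the Azure analogue).

```xml
<?xml version="1.0"?>
<configuration>
  <!-- Metastore the driver connects to; this is the URI echoed
       in the "Trying to connect to metastore" log line above -->
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://metastore-host.example.com:9083</value>
  </property>
  <!-- Default warehouse location; an external URI here (s3a://, wasb://)
       is one way to keep table data outside the cluster -->
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>s3a://my-bucket/hive/warehouse</value>
  </property>
</configuration>
```

With that file in conf/, the two getConf calls in the transcript above should return these values.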


On Thu, May 14, 2015 at 6:26 PM, Tamas Jambor <jambo...@gmail.com> wrote:

> I have tried putting the hive-site.xml file in the conf/ directory, but
> it seems Spark is not picking it up from there.
>
>
> On Thu, May 14, 2015 at 6:50 PM, Michael Armbrust <mich...@databricks.com>
> wrote:
>
>> You can configure Spark SQL's Hive interaction by placing a hive-site.xml
>> file in the conf/ directory.
>>
>> On Thu, May 14, 2015 at 10:24 AM, jamborta <jambo...@gmail.com> wrote:
>>
>>> Hi all,
>>>
>>> is it possible to set hive.metastore.warehouse.dir, which is internally
>>> created by Spark, to point to external storage (e.g. S3 on AWS or WASB
>>> on Azure)?
>>>
>>> thanks,
>>>
>>>
>>>
>>> --
>>> View this message in context:
>>> http://apache-spark-user-list.1001560.n3.nabble.com/store-hive-metastore-on-persistent-store-tp22891.html
>>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>>
>>> ---------------------------------------------------------------------
>>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>>> For additional commands, e-mail: user-h...@spark.apache.org
>>>
>>>
>>
>
