[ https://issues.apache.org/jira/browse/SPARK-15034?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15301518#comment-15301518 ]

Yin Huai commented on SPARK-15034:
----------------------------------

It seems your default file system is HDFS. Since we need a warehouse directory 
to store tables, Spark 2.0 adds a flag, {{spark.sql.warehouse.dir}}, to set the 
warehouse location. The default value of {{spark.sql.warehouse.dir}} is 
{{System.getProperty("user.dir")/spark-warehouse}}. In your case, you probably 
have a Hadoop {{core-site.xml}} file on the classpath with HDFS set as the 
default filesystem. Then, after we qualify the path, we get 
{{hdfs://namenode:8020/usr/local/lib/spark-2.0.0-SNAPSHOT-bin-cdh5.4/spark-warehouse}}.
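If that qualified default is not what you want, the workaround is to point 
{{spark.sql.warehouse.dir}} at an explicit location. A minimal sketch; the 
host, port, and paths below are illustrative, not taken from this report:

```
# conf/spark-defaults.conf -- illustrative values only
# Keep the warehouse on the local filesystem even when HDFS is the
# default filesystem in core-site.xml:
spark.sql.warehouse.dir   file:///usr/local/spark/spark-warehouse

# Or place it at a deliberate HDFS location instead:
# spark.sql.warehouse.dir   hdfs://namenode:8020/user/spark/warehouse
```

The same setting can also be passed per application, e.g. with 
{{--conf spark.sql.warehouse.dir=...}} on {{spark-submit}}, or via 
{{SparkSession.builder.config(...)}} before the session is created.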

> Use the value of spark.sql.warehouse.dir as the warehouse location instead of 
> using hive.metastore.warehouse.dir
> ----------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-15034
>                 URL: https://issues.apache.org/jira/browse/SPARK-15034
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>            Reporter: Yin Huai
>            Assignee: Yin Huai
>              Labels: release_notes, releasenotes
>             Fix For: 2.0.0
>
>
> Starting with Spark 2.0, {{spark.sql.warehouse.dir}} will be the conf used to 
> set the warehouse location. We will no longer use 
> {{hive.metastore.warehouse.dir}}.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
