[ https://issues.apache.org/jira/browse/SPARK-17918?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen resolved SPARK-17918.
-------------------------------
Resolution: Duplicate
> Default Warehouse location apparently in HDFS
> ----------------------------------------------
>
> Key: SPARK-17918
> URL: https://issues.apache.org/jira/browse/SPARK-17918
> Project: Spark
> Issue Type: Bug
> Affects Versions: 2.0.1
> Environment: Mac OS X 10.11.6
> Reporter: Alessio
>
> It seems that the default warehouse location in Spark 2.0.1 not only points
> to a nonexistent folder on Macintosh systems (/user/hive/warehouse) - see the
> first INFO line - but that path is also resolved against HDFS - see the error.
> This was fixed in 2.0.0, as previous issues reported, but reappears in 2.0.1.
> Indeed, some scripts I was able to run in 2.0.0 now throw these errors.
> Spark 2.0.0 used to create the spark-warehouse folder within the current
> directory (which was good) and didn't complain about such paths, especially
> since I'm not using Spark with HDFS, only locally.
> *16/10/13 20:47:36 INFO internal.SharedState: Warehouse path is
> '/user/hive/warehouse'.*
> *py4j.protocol.Py4JJavaError: An error occurred while calling o32.load.*
> *: org.apache.spark.SparkException: Unable to create database default as
> failed to create its directory* *hdfs://localhost:9000/user/hive/warehouse*
> {color:red}Update #1:{color}
> I was able to reinstall Spark 2.0.0, and its first INFO message clearly
> states:
> *16/10/13 21:06:59 INFO internal.SharedState: Warehouse path is 'file:/<local
> FS folder>/spark-warehouse'.*
> {color:red}Update #2:{color}
> In both Spark 2.0.0 and 2.0.1 I didn't edit any config files or the like;
> everything is at its defaults.
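> As a possible workaround (a minimal sketch, assuming PySpark; the path and
> app name below are illustrative, not part of the original report), the
> warehouse can be pointed at a local file: URI explicitly via the documented
> {{spark.sql.warehouse.dir}} setting when building the SparkSession:
> {code:python}
> import os
> from pyspark.sql import SparkSession
>
> # Point the warehouse at a local directory explicitly, so Spark does not
> # resolve the default /user/hive/warehouse path against HDFS
> # (hdfs://localhost:9000). The path here is illustrative.
> warehouse = "file://" + os.path.join(os.getcwd(), "spark-warehouse")
>
> spark = (SparkSession.builder
>          .appName("warehouse-workaround")  # hypothetical app name
>          .config("spark.sql.warehouse.dir", warehouse)
>          .getOrCreate())
> {code}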
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)