[ https://issues.apache.org/jira/browse/SPARK-17918?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Alessio updated SPARK-17918:
----------------------------
    Description: 
It seems that the default warehouse location in Spark 2.0.1 not only points at a 
nonexistent folder on Mac OS X systems (/user/hive/warehouse) - see the first 
INFO line below - but that path also gets resolved against HDFS - see the error below.

This was fixed in 2.0.0, as previously reported issues show, but it appears again in 
2.0.1. Indeed, scripts that ran fine under 2.0.0 now throw these errors: 
Spark 2.0.0 used to create the spark-warehouse folder within the current 
directory (which was fine) and did not complain about such paths, especially 
because I am not using Spark with HDFS at all, but only locally.


*16/10/13 20:47:36 INFO internal.SharedState: Warehouse path is '/user/hive/warehouse'.*

*py4j.protocol.Py4JJavaError: An error occurred while calling o32.load.*
*: org.apache.spark.SparkException: Unable to create database default as failed to create its directory hdfs://localhost:9000/user/hive/warehouse*

Update #1:
After reinstalling Spark 2.0.0, its first INFO message clearly states:
*16/10/13 21:06:59 INFO internal.SharedState: Warehouse path is 'file:/<local FS folder>/spark-warehouse'.*
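
A possible workaround, assuming the problem is only how the default path gets resolved, is to set spark.sql.warehouse.dir explicitly when building the session, so nothing is resolved against hdfs://localhost:9000. A minimal PySpark sketch (the file:///tmp/spark-warehouse path and the app name are only placeholders; any writable local directory should do):

{code:python}
from pyspark.sql import SparkSession

# Point the warehouse at an explicit local directory instead of relying on
# the default, which 2.0.1 resolves against hdfs://localhost:9000.
spark = (SparkSession.builder
         .appName("warehouse-path-workaround")                               # placeholder app name
         .config("spark.sql.warehouse.dir", "file:///tmp/spark-warehouse")   # illustrative local path
         .getOrCreate())

# Confirm which warehouse path the session actually picked up.
print(spark.conf.get("spark.sql.warehouse.dir"))
{code}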


  was:
It seems that the default warehouse location in Spark 2.0.1 not only points at a 
nonexistent folder on Mac OS X systems (/user/hive/warehouse) - see the first 
INFO line below - but that path also gets resolved against HDFS - see the error below.

This was fixed in 2.0.0, as previously reported issues show, but it appears again in 
2.0.1. Indeed, scripts that ran fine under 2.0.0 now throw these errors: 
Spark 2.0.0 used to create the spark-warehouse folder within the current 
directory (which was fine) and did not complain about such paths, especially 
because I am not using Spark with HDFS at all, but only locally.


*16/10/13 20:47:36 INFO internal.SharedState: Warehouse path is '/user/hive/warehouse'.*

*py4j.protocol.Py4JJavaError: An error occurred while calling o32.load.*
*: org.apache.spark.SparkException: Unable to create database default as failed to create its directory hdfs://localhost:9000/user/hive/warehouse*

Update #1:
After reinstalling Spark 2.0.0, its first INFO message clearly states:
*16/10/13 21:06:59 INFO internal.SharedState: Warehouse path is 'file:/Users/Purple/Documents/YARNprojects/Spark_K-MEANS/version_postgreSQL/spark-warehouse'.*



> Default Warehouse location apparently in HDFS 
> ----------------------------------------------
>
>                 Key: SPARK-17918
>                 URL: https://issues.apache.org/jira/browse/SPARK-17918
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 2.0.1
>         Environment: Mac OS X 10.11.6
>            Reporter: Alessio
>



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
