[ https://issues.apache.org/jira/browse/SPARK-10528?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15093151#comment-15093151 ]

Amir Gur edited comment on SPARK-10528 at 1/15/16 5:41 PM:
-----------------------------------------------------------

Should this not be reopened, given that it still happens for many folks, as the
most recent comments suggest?

[~srowen] said it is an environment issue (at 
https://issues.apache.org/jira/browse/SPARK-10528?focusedCommentId=14958759&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-14958759)
 and closed it.  Some posted workarounds that solved it for them; for me, none of 
those worked.

To reproduce: I get the same error with either {{spark-shell --master local[2]}} 
or a plain Maven project to which I added the HiveFromSpark example from the 
Spark codebase, running on Windows 8 x64, picking Spark 1.5.2 (or 1.6.0) and 
spark-hive_2.11 (or 2.10), and running with {{sparkConf.setMaster("local")}}.

I opened a debug session, which shows the error comes from the {{throw new 
RuntimeException}} in 
{{org.apache.hadoop.hive.ql.session.SessionState#createRootHDFSDir}} at line 612 
of org/spark-project/hive/hive-exec/1.2.1.spark/hive-exec-1.2.1.spark-sources.jar, 
which is:

{code}
    // If the root HDFS scratch dir already exists, make sure it is writeable.
    if (!((currentHDFSDirPermission.toShort() & writableHDFSDirPermission
        .toShort()) == writableHDFSDirPermission.toShort())) {
      throw new RuntimeException("The root scratch dir: " + rootHDFSDirPath
          + " on HDFS should be writable. Current permissions are: "
          + currentHDFSDirPermission);
    }
{code}
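For reference, here is a minimal standalone sketch of why that check fails with the permissions reported in this issue. It assumes the required permission is 733, as hard-coded for {{writableHDFSDirPermission}} in Hive 1.2.1's {{SessionState.createRootHDFSDir}}; the variable names below are illustrative, not Hive's own.

{code:java}
// Sketch (assumption: Hive requires 733 on the root scratch dir) of the
// bitmask test in createRootHDFSDir, applied to the "rw-rw-rw-" permissions
// reported in this issue.
public class ScratchDirPermissionDemo {
    public static void main(String[] args) {
        short current  = 0666; // rw-rw-rw-, as reported by the exception
        short required = 0733; // rwx-wx-wx, what createRootHDFSDir demands
        // Same test as the Hive code: every required bit must be present.
        boolean writable = (current & required) == required;
        // 0666 & 0733 == 0622, which is not 0733, so writable is false
        // and the RuntimeException above is thrown.
        System.out.println("writable = " + writable);
    }
}
{code}

Since 733 requires the write and execute bits for group and other, a /tmp/hive directory showing rw-rw-rw- will always trip this check, regardless of what the filesystem actually allows.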



> spark-shell throws java.lang.RuntimeException: The root scratch dir: 
> /tmp/hive on HDFS should be writable.
> ----------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-10528
>                 URL: https://issues.apache.org/jira/browse/SPARK-10528
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 1.5.0
>         Environment: Windows 7 x64
>            Reporter: Aliaksei Belablotski
>            Priority: Minor
>
> Starting spark-shell throws
> java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: 
> /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
