[ 
https://issues.apache.org/jira/browse/SPARK-10528?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14739499#comment-14739499
 ] 

Aliaksei Belablotski commented on SPARK-10528:
----------------------------------------------

Thanks a lot, Marcelo. Yes, Windows is slightly different.
However, I have 1.4.1 running without this issue, and its spark-shell log is clean - no
errors at all.
Both 1.4.1 and 1.5.0 are installed on the same drive on my laptop.

Also, even after throwing the exception, spark-shell 1.5.0 works fine:
I'm able to create and manipulate RDDs in the REPL console.
Maybe there will be problems with larger datasets, but for now it works.
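For reference, a minimal sketch of the kind of check I run in the REPL, assuming only the default SparkContext (sc) that spark-shell provides; the names are illustrative:

    // Run inside spark-shell 1.5.0 on Windows, using the sc the shell provides.
    // Quick sanity check that RDD creation, transformations and actions still
    // work despite the /tmp/hive exception at startup.
    val nums = sc.parallelize(1 to 1000, 4)      // small in-memory RDD, 4 partitions
    val doubled = nums.map(_ * 2)                // simple transformation
    println(doubled.sum())                       // action: prints 1001000.0
    println(nums.filter(_ % 3 == 0).count())     // another action: prints 333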

> spark-shell throws java.lang.RuntimeException: The root scratch dir: 
> /tmp/hive on HDFS should be writable.
> ----------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-10528
>                 URL: https://issues.apache.org/jira/browse/SPARK-10528
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 1.5.0
>         Environment: Windows 7 x64
>            Reporter: Aliaksei Belablotski
>            Priority: Minor
>
> Starting spark-shell throws
> java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: 
> /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-
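For anyone hitting the same error, a hedged diagnostic sketch that can be pasted into spark-shell to inspect (and, if desired, relax) the scratch-directory permissions Hive complains about. It assumes the default Hadoop Configuration resolves /tmp/hive against the local file system, as is typical for spark-shell on Windows (e.g. C:\tmp\hive on the drive the shell was started from):

    // Diagnostic sketch only: inspect and optionally fix the permissions of the
    // Hive scratch directory that is checked at spark-shell startup.
    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.fs.{FileSystem, Path}
    import org.apache.hadoop.fs.permission.FsPermission

    val fs = FileSystem.get(new Configuration())
    val scratchDir = new Path("/tmp/hive")
    println(fs.getFileStatus(scratchDir).getPermission)  // reported as rw-rw-rw- above

    // Hive expects the scratch dir to be writable (733 or wider); on Windows this
    // call goes through winutils.exe, so it may fail if winutils is not set up.
    fs.setPermission(scratchDir, new FsPermission("733"))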



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org