[ https://issues.apache.org/jira/browse/SPARK-10528?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15114080#comment-15114080 ]

Alexander Ulanov edited comment on SPARK-10528 at 1/24/16 1:30 AM:
-------------------------------------------------------------------

Hi! I'm getting the same problem on Windows 7 x64 with Spark 1.6.0. It worked 
with earlier versions of Spark; I've checked Spark 1.4.1 and it works fine. 
Changing permissions does not help: Spark eventually launches despite the 
error, but it does not provide a sqlContext.
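
For reference, this is roughly what I mean by changing permissions (a sketch of 
what I tried; it assumes winutils.exe from the Hadoop binaries, HADOOP_HOME 
pointing at their location, and spark-shell started from the same drive as 
\tmp\hive):

    REM check the permissions Hive sees on the scratch dir
    %HADOOP_HOME%\bin\winutils.exe ls \tmp\hive
    REM open it up completely, then re-check
    %HADOOP_HOME%\bin\winutils.exe chmod 777 \tmp\hive
    %HADOOP_HOME%\bin\winutils.exe ls \tmp\hive

Even after that, the next spark-shell start fails with the same error.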

Is there a workaround? 


was (Author: avulanov):
Hi! I'm getting the same problem on Windows 7 x64 with Spark 1.6.0. It worked 
with earlier versions of Spark. Changing permissions does not help. Is there 
a workaround? 

> spark-shell throws java.lang.RuntimeException: The root scratch dir: 
> /tmp/hive on HDFS should be writable.
> ----------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-10528
>                 URL: https://issues.apache.org/jira/browse/SPARK-10528
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 1.5.0
>         Environment: Windows 7 x64
>            Reporter: Aliaksei Belablotski
>            Priority: Minor
>
> Starting spark-shell throws
> java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: 
> /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-


