[
https://issues.apache.org/jira/browse/SPARK-10528?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15037296#comment-15037296
]
WEI ZHU edited comment on SPARK-10528 at 12/3/15 5:43 AM:
----------------------------------------------------------
I ran into the same problem using the precompiled binaries (1.5.2, Hadoop 2.6) on CentOS 6.5.
When I run ./spark-shell, it throws the exception below, while ./pyspark works fine for me.
hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
15/12/03 02:04:08 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwx------
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
	at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:171)
	at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:162)
	at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:160)
	at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:167)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
P.S. The permissions of the HDFS folder look okay to me, as shown below:
[root@sandbox bin]# hadoop fs -ls /tmp
Found 3 items
drwxrwxrwx - hive hdfs 0 2015-12-01 05:35 /tmp/hive
drwxrwxrwx - root hdfs 0 2015-08-12 00:37 /tmp/temp-1409396078
drwxrwxrwx - hdfs hdfs 0 2014-12-16 19:30 /tmp/udfs
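Since the HDFS listing above shows /tmp/hive as drwxrwxrwx while the exception reports rwx------, one possible explanation (an assumption, not confirmed in this thread) is that Hive's SessionState is validating the *local* filesystem's /tmp/hive rather than the HDFS path. A minimal sketch of checking and widening both, under that assumption:

```shell
# Assumption: the rwx------ in the exception refers to the local /tmp/hive,
# not the HDFS directory shown by `hadoop fs -ls /tmp` above.

# Inspect the local scratch dir permissions
ls -ld /tmp/hive

# Widen the local directory
chmod -R 777 /tmp/hive

# Widen the HDFS directory as well, for good measure
hadoop fs -chmod -R 777 /tmp/hive
```

On Windows (the environment in the original report), the rough equivalent would be `winutils.exe chmod -R 777 \tmp\hive` run from the Hadoop bin directory.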
> spark-shell throws java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable.
> ----------------------------------------------------------------------------------------------------------
>
> Key: SPARK-10528
> URL: https://issues.apache.org/jira/browse/SPARK-10528
> Project: Spark
> Issue Type: Bug
> Components: Spark Shell
> Affects Versions: 1.5.0
> Environment: Windows 7 x64
> Reporter: Aliaksei Belablotski
> Priority: Minor
>
> Starting spark-shell throws
> java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir:
> /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)