Seems like you have "hive.server2.enable.doAs" enabled; you can either
disable it, or configure hs2 so that the user running the service
("hadoop" in your case) can impersonate others.
See:
https://hadoop.apache.org/docs/r2.7.0/hadoop-project-dist/hadoop-common/Superusers.html
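For reference, impersonation for a superuser named "hadoop" is enabled via proxyuser properties in core-site.xml on the cluster, along these lines (the wildcard values below are only an illustration; restrict hosts and groups appropriately for your environment):

```xml
<!-- core-site.xml: allow the "hadoop" user to impersonate others.
     Wildcard values are placeholders; tighten them for production. -->
<property>
  <name>hadoop.proxyuser.hadoop.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hadoop.groups</name>
  <value>*</value>
</property>
```

The services need a restart after changing these for the new proxyuser settings to take effect.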
-Original Message-
From: Marcelo Vanzin [mailto:van...@cloudera.com]
Sent: Friday, September 25, 2015 1:12 PM
To: Garry Chen
Cc: Jimmy Xiang ; user@spark.apache.org
Subject: Re: hive on spark query error
On Fri, Sep 25, 2015 at 10:05 AM, Garry Chen wrote:
> In spark-defaults.conf the spark.master is spark://hostname:7077. From
> hive-site.xml:
>   <property>
>     <name>spark.master</name>
>     <value>hostname</value>
>   </property>
>
That's not a valid value for spark.master (as the error indicates).
You should set it to "spark://hostname:7077".
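That is, the hive-site.xml entry would look something like this ("hostname" here is a placeholder for your actual Spark master host):

```xml
<!-- hive-site.xml: spark.master must be a full Spark master URL,
     not a bare hostname. "hostname" is a placeholder. -->
<property>
  <name>spark.master</name>
  <value>spark://hostname:7077</value>
</property>
```

The value must start with one of the schemes the error lists (yarn, spark, mesos, or local), which is why a bare hostname is rejected.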
In spark-defaults.conf the spark.master is spark://hostname:7077. From
hive-site.xml:

  <property>
    <name>spark.master</name>
    <value>hostname</value>
  </property>
From: Jimmy Xiang [mailto:jxi...@cloudera.com]
Sent: Friday, September 25, 2015 1:00 PM
To: Garry Chen
Cc: user@spark.apache.org
Subject: Re: hive on spark query error
> Error: Master must start with yarn, spark, mesos, or local
What's your setting for spark.master?
On Fri, Sep 25, 2015 at 9:56 AM, Garry Chen wrote:
> Hi All,
>
> I am following
> https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started?
> to setup hive