-- Original --
From: "Zhan Zhang";
Send time: Thursday, Feb 12, 2015 2:00 AM
To: "";
Cc: "user@spark.apache.org"; "Cheng Lian";
Subject: Re: Can't access remote Hive table from spark
You need the right HDFS account (e.g., hdfs) to create the directory and
assign permissions.
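The advice above can be sketched as shell commands (a sketch only; the `/user/xiaobogu` path and the `hdfs` superuser name follow this thread's example — adjust both for your cluster):

```shell
# Run as a user allowed to act as the HDFS superuser (commonly "hdfs").
# Creates the per-user home directory and hands ownership to xiaobogu.
sudo -u hdfs hdfs dfs -mkdir -p /user/xiaobogu
sudo -u hdfs hdfs dfs -chown xiaobogu:xiaobogu /user/xiaobogu
```

These commands require a running HDFS cluster; they are shown only to illustrate the directory-plus-ownership step the reply describes.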
-- Original --
From: "Zhan Zhang";
Send time: Sunday, Feb 8, 2015 4:11 AM
To: "";
Cc: "user@spark.apache.org"; "Cheng Lian";
Subject: Re: Can't access remote Hive table from spark
Yes. You need to create xiaobogu under /user and provide the right permissions.
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:...)
... 24 more
[xiaobogu@lix1 spark]$
-- Original --
From: "Zhan Zhang";
Send time: Friday, Feb 6, 2015 2:55 PM
To: "";
Cc: "user@spark.apache.org"; "Cheng Lian";
Subject: Re: Can't access remote Hive table from spark
I am not sure about Spark standalone mode, but on Spark on YARN it should work.
You can check the following link:
http://hortonworks.com/hadoop-tutorial/using-apache-spark-hdp/
Thanks.
Zhan Zhang
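Running the job on YARN instead of standalone, as suggested above, can be sketched as a spark-submit invocation (a sketch only; the jar name and class are hypothetical placeholders, and `yarn-client` is the Spark 1.x master syntax):

```shell
# Hypothetical application jar and main class; substitute your own.
# "yarn-client" runs the driver locally while executors run on YARN.
spark-submit --master yarn-client \
  --class com.example.HiveQuery \
  my-app.jar
```

This requires a configured YARN cluster and is shown only to illustrate the spark-on-yarn mode the reply recommends.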
On Feb 5, 2015, at 5:02 PM, Cheng Lian <lian.cs@gmail.com> wrote:
Please note that Spark 1.2.0 only supports Hive 0.13.1 or 0.12.0;
no other versions are supported.
Best,
Cheng
On 1/25/15 12:18 AM, guxiaobo1982 wrote:
Hi,
I built and started a single-node standalone Spark 1.2.0 cluster along
with a single-node Hive 0.14.0 instance installed by Ambari.
"Jörn Franke";
Subject: Re: Can't access remote Hive table from spark
I am sorry, I forgot to say that I have created the table manually.
On Feb 1, 2015, at 4:14 PM, Jörn Franke wrote:
You commented out the line which is supposed to create a table.
On Jan 25, 2015, 09:20, "guxiaobo1982" wrote:
-- Original --
From: "Skanda Prasad";
Send time: Monday, Jan 26, 2015 7:41 AM
To: ""; "user@spark.apache.org";
Subject: RE: Can't access remote Hive table from spark
This happened to me as well, putting hive-site.xml inside conf doesn't seem to
work. Instead I added /etc/hive/conf to SPARK_CLASSPATH and it worked. You can
try this approach.
-Skanda
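Skanda's workaround can be sketched as a line in `conf/spark-env.sh` (a sketch assuming Hive's client configuration lives in `/etc/hive/conf`, the usual location on Ambari/HDP installs; adjust the path if yours differs):

```shell
# Prepend the Hive client config directory so Spark picks up hive-site.xml.
# ${SPARK_CLASSPATH:-} expands to empty if the variable was previously unset.
export SPARK_CLASSPATH="/etc/hive/conf:${SPARK_CLASSPATH:-}"
```

Note that SPARK_CLASSPATH was the Spark 1.x-era mechanism; it only needs the directory containing hive-site.xml, not Hive's jars.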
----- Original Message -----
From: "guxiaobo1982"
Sent: 25-01-2015 13:50
To: "user@spark.apache.org"
Subject: Can't access remote Hive table from spark