Re: spark-1.6.1-bin-without-hadoop can not use spark-sql

2016-06-22 Thread Ted Yu
http://d3kbcqa49mib13.cloudfront.net/spark-1.6.1-bin-hadoop2.6.tgz> which > is a pre-built package on hadoop 2.7.2? > > -- Original Message -- > *From:* "Ted Yu"; > *Sent:* Wednesday, June 22, 2016, 11:51 PM > *To:* "喜之郎" <251922...@qq.com>; > *…

Re: spark-1.6.1-bin-without-hadoop can not use spark-sql

2016-06-22 Thread 喜之郎
.@qq.com>; : "user"; : Re: spark-1.6.1-bin-without-hadoop can not use spark-sql build/mvn clean -Phive -Phive-thriftserver -Pyarn -Phadoop-2.6 -Psparkr -Dhadoop.version=2.7.2 package On Wed, Jun 22, 2016 at 8:00 AM, 251922566 <251922...@qq.com> wrote: ok,i will re

Re: spark-1.6.1-bin-without-hadoop can not use spark-sql

2016-06-22 Thread Ted Yu
…hat on param --hadoop, 2.7.2 or others? > > Sent from my Huawei phone > > -- Original Message -- > Subject: Re: spark-1.6.1-bin-without-hadoop can not use spark-sql > From: Ted Yu > To: 喜之郎 <251922...@qq.com> > Cc: user > > I wonder if the tar ball was built with: > > -Phive -Phi…

Re: spark-1.6.1-bin-without-hadoop can not use spark-sql

2016-06-22 Thread 251922566
OK, I will rebuild it myself. If I want to use Spark with Hadoop 2.7.2, what should I put in the --hadoop param when I build Spark: 2.7.2 or something else? Sent from my Huawei phone. Original Message. Subject: Re: spark-1.6.1-bin-without-hadoop can not use spark-sql. From: Ted Yu. To: 喜之郎 <251922...@qq.com>. Cc: user. I wonder if t…

Re: spark-1.6.1-bin-without-hadoop can not use spark-sql

2016-06-22 Thread Ted Yu
I wonder if the tar ball was built with: -Phive -Phive-thriftserver Maybe rebuild it yourself with the above? FYI. On Wed, Jun 22, 2016 at 4:38 AM, 喜之郎 <251922...@qq.com> wrote: > Hi all. > I downloaded spark-1.6.1-bin-without-hadoop.tgz >…

spark-1.6.1-bin-without-hadoop can not use spark-sql

2016-06-22 Thread 喜之郎
Hi all. I downloaded spark-1.6.1-bin-without-hadoop.tgz from the website and configured "SPARK_DIST_CLASSPATH" in spark-env.sh. Now spark-shell runs well, but spark-sql cannot run. My Hadoop version is 2.7.2. This is the error info: bin/spark-sql java.lang.ClassNotFoundException: org.apache.spark.sql…
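A minimal sketch of the spark-env.sh setup the poster describes, for the "without-hadoop" package. The install path is hypothetical; `hadoop classpath` is the standard way to generate the classpath for a Hadoop-free Spark build. Note that, per Ted Yu's reply, this alone does not fix spark-sql: the "without-hadoop" tarball was apparently built without the Hive/Thrift-server profiles, so the spark-sql classes are simply absent from the assembly.

```shell
# conf/spark-env.sh
# Assumption: Hadoop 2.7.2 is installed at this (hypothetical) path.
export HADOOP_HOME=/opt/hadoop-2.7.2

# Hand Spark the Hadoop jars at runtime; `hadoop classpath` prints the
# full classpath of the local Hadoop installation.
export SPARK_DIST_CLASSPATH=$("${HADOOP_HOME}/bin/hadoop" classpath)
```

With this in place, spark-shell finds the Hadoop classes (which is why it "runs well"), while spark-sql still fails until Spark itself is rebuilt with -Phive -Phive-thriftserver.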