Re: Spark 1.0.0 Standalone AppClient cannot connect to Master

2014-06-12 Thread Hao Wang
Hi, Andrew

Got it, thanks!

Hao

Regards,
Wang Hao(王灏)

CloudTeam | School of Software Engineering
Shanghai Jiao Tong University
Address: 800 Dongchuan Road, Minhang District, Shanghai, 200240
Email: wh.s...@gmail.com




Re: Spark 1.0.0 Standalone AppClient cannot connect to Master

2014-06-12 Thread Andrew Or
Hi Wang Hao,

This section was not removed; it moved here:
http://spark.apache.org/docs/latest/hadoop-third-party-distributions.html
If you're building with SBT and you don't specify SPARK_HADOOP_VERSION,
the build defaults to Hadoop 1.0.4.
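
For example (going from memory here, so double-check against that page),
pinning the Hadoop version in an SBT build looks like this; 2.2.0 is only
an illustration, not a recommendation:

    # Build against a specific Hadoop version (2.2.0 is just an example)
    SPARK_HADOOP_VERSION=2.2.0 sbt/sbt assembly

    # The same build with YARN support enabled
    SPARK_HADOOP_VERSION=2.2.0 SPARK_YARN=true sbt/sbt assembly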

Andrew




Spark 1.0.0 Standalone AppClient cannot connect to Master

2014-06-12 Thread Hao Wang
Hi, all

Why was the section on building Spark against a corresponding Hadoop
version removed from the official Spark 1.0.0 docs?

Does that mean I don't need to specify the Hadoop version when I build
Spark 1.0.0 with `sbt/sbt assembly`?


Regards,
Wang Hao(王灏)

CloudTeam | School of Software Engineering
Shanghai Jiao Tong University
Address: 800 Dongchuan Road, Minhang District, Shanghai, 200240
Email: wh.s...@gmail.com