Are you using 0.8.1? It will build with protobuf 2.5 instead of 2.4 as long as
you make it depend on Hadoop 2.2. But make sure you build it with
SPARK_HADOOP_VERSION=2.2.0 or whatever.
Spark 0.8.0 doesn’t support Hadoop 2.2 due to this issue.
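For reference, the build described above looks roughly like this (a sketch based on the 0.8.1-era SBT build; adjust the Hadoop version to match your cluster):

```shell
# Build Spark 0.8.1 against Hadoop 2.2 with YARN support,
# which pulls in protobuf 2.5 instead of 2.4.
SPARK_HADOOP_VERSION=2.2.0 SPARK_YARN=true sbt/sbt assembly
```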
Matei
On Dec 15, 2013, at 10:25 PM, Azuryy Yu wrote:

Hi Matei,
Thanks for your response. I am using 0.8.1, and yes, it was using
protobuf 2.5. Sorry, I made a mistake in my earlier email.
I used -Phadoop2-yarn, so it wasn't picking up protobuf 2.5; I should
use -Pnew-yarn instead.
Thank you Matei.
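For the Maven build, the profile difference mentioned above looks roughly like this (a sketch; -Pnew-yarn is the profile added in 0.8.1 for the Hadoop 2.2 line, and the exact version flags may need adjusting for your setup):

```shell
# Old YARN profile (Hadoop 2.0.x alpha line, protobuf 2.4):
mvn -Phadoop2-yarn -Dhadoop.version=2.0.5-alpha -DskipTests package

# New YARN profile (Hadoop 2.2, protobuf 2.5):
mvn -Pnew-yarn -Dhadoop.version=2.2.0 -Dyarn.version=2.2.0 -DskipTests package
```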
On Mon, Dec 16, 2013 at 4:22 PM, Matei Zaharia wrote:
Thanks Evan, I tried it and the new SBT direct import seems to work well,
though I did run into issues with some yarn imports on Spark.
On Thu, Dec 12, 2013 at 7:03 PM, Evan Chan wrote:
> Nick, have you tried using the latest Scala plug-in, which features native
> SBT project imports? ie y
Great job everyone! A big step forward.
On Sat, Dec 14, 2013 at 2:37 AM, andy.petre...@gmail.com <
andy.petre...@gmail.com> wrote:
> That's very good news!
> Congrats
>
> Sent from my HTC
>
> - Reply message -
> From: "Sam Bessalah"
> To: "dev@spark.incubator.apache.org"
> Ob
Any news regarding this setting? Is this expected behaviour? Is there some
other way I can have Spark fail-fast?
Thanks!
On Mon, Dec 9, 2013 at 4:35 PM, Grega Kešpret wrote:
> Hi!
>
> I tried this (by setting spark.task.maxFailures to 1) and it still does
> not fail-fast. I started a job and af
I just merged your pull request
https://github.com/apache/incubator-spark/pull/245
On Mon, Dec 16, 2013 at 2:12 PM, Grega Kešpret wrote:
> Any news regarding this setting? Is this expected behaviour? Is there some
> other way I can have Spark fail-fast?
>
> Thanks!
>
> On Mon, Dec 9, 2013 at 4:
This was a good fix - thanks for the contribution.
On Mon, Dec 16, 2013 at 2:16 PM, Reynold Xin wrote:
> I just merged your pull request
> https://github.com/apache/incubator-spark/pull/245
>
>
> On Mon, Dec 16, 2013 at 2:12 PM, Grega Kešpret wrote:
>
>> Any news regarding this setting? Is this
I guess it should really be "maximum number of total task run attempts".
At least that's what it looks like logically. In that sense, the rest of the
documentation is correct (it should be at least 1; 1 = the task is allowed no
retries (1 - 1 = 0)).
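The "maxFailures = total attempts" reading above can be illustrated with a plain retry loop (this is an illustration of the semantics, not Spark's actual scheduler code; `run_with_retries` and `max_failures` are hypothetical names):

```python
def run_with_retries(task, max_failures):
    """Run `task` up to `max_failures` total attempts.

    A value of 1 means one attempt and zero retries (1 - 1 = 0);
    after the final failed attempt the exception is re-raised.
    """
    attempts = 0
    while True:
        attempts += 1
        try:
            return task(), attempts
        except Exception:
            if attempts >= max_failures:
                raise
```

With `max_failures=1`, a failing task runs exactly once and the failure propagates immediately, which matches the fail-fast behaviour being discussed.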
On Fri, Nov 29, 2013 at 2:02 AM, Grega Kešpret wrote: