That is, hadoop 1.2.1: no cluster, just my local machine.

Master seems to be building fine today.  

I'm building and testing from Pat's hadoop-client branch now, using:

  $ mvn clean install package -Dhadoop.version=1.2.1  

With a clean maven repo and SPARK_HOME unset.
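
For anyone reproducing this, the full sequence is roughly the following (a
sketch; wiping all of ~/.m2/repository is just the blunt way to guarantee a
clean repo):

  $ rm -rf ~/.m2/repository    # start from a pristine local maven repo
  $ unset SPARK_HOME           # make sure no installed Spark leaks in
  $ mvn clean install package -Dhadoop.version=1.2.1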

> From: [email protected]
> To: [email protected]
> Subject: RE: Jenkins build became unstable: mahout-nightly » Mahout Spark 
> bindings #1728
> Date: Fri, 31 Oct 2014 12:44:49 -0400
> 
> no - hadoop 1.2.1
> 
> > Subject: Re: Jenkins build became unstable: mahout-nightly » Mahout Spark 
> > bindings #1728
> > From: [email protected]
> > Date: Fri, 31 Oct 2014 09:41:34 -0700
> > To: [email protected]
> > 
> > Are you on hadoop 2.2?
> > 
> > On Oct 31, 2014, at 9:37 AM, Andrew Palumbo <[email protected]> wrote:
> > 
> > Yes, this is odd. To confuse things further, I cleaned out my local maven 
> > repo again this morning, and this time built and tested without errors. 
> > I'm double-checking this again now.
> > 
> > 
> > > Subject: Re: Jenkins build became unstable: mahout-nightly » Mahout Spark 
> > > bindings #1728
> > > From: [email protected]
> > > Date: Fri, 31 Oct 2014 09:26:56 -0700
> > > To: [email protected]
> > > 
> > > I think that’s because the Spark in the maven repos is tied to hadoop 2 
> > > and the default in master is 1.2.1.
> > > 
> > > Sounds like you are the closest to the build machines. Can you try 
> > > https://github.com/pferrel/mahout/tree/hadoop-client ?
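> > > 
> > > A minimal sketch of one way to check it out and build (the remote name 
> > > "pferrel" below is just illustrative):
> > > 
> > >   $ git remote add pferrel https://github.com/pferrel/mahout.git
> > >   $ git fetch pferrel
> > >   $ git checkout -b hadoop-client pferrel/hadoop-client
> > >   $ mvn clean install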
> > 
> > 
> > Sure, I'll try this.
> > 
> > 
> > > 
> > > This is a merge of Gokhan’s patch with master. It should default to 
> > > hadoop 2 and theoretically should have all artifacts in alignment.
> > > 
> > > On Oct 30, 2014, at 8:11 PM, Andrew Palumbo <[email protected]> wrote:
> > > 
> > > I cleaned out my mvn repo, unset SPARK_HOME, and ran 
> > > 
> > > $ mvn clean install 
> > > 
> > > from the latest master. Now I'm getting the failure you're talking about:
> > > 
> > > - ddsvd - naive - q=1 *** FAILED ***
> > > org.apache.spark.SparkException: Job aborted due to stage failure: Task 9 
> > > in stage 28.0 failed 1 times, most recent failure: Lost task 9.0 in stage 
> > > 28.0 (TID 81, localhost): java.io.IOException: PARSING_ERROR(2)
> > >       org.xerial.snappy.SnappyNative.throw_error(SnappyNative.java:78)
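> > > 
> > > (PARSING_ERROR from snappy can come from mismatched snappy-java 
> > > artifacts on the classpath; one quick check is to see which versions 
> > > the build actually pulls in. A sketch:)
> > > 
> > >   $ mvn dependency:tree -Dincludes=org.xerial.snappy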
> > > 
> > >> Subject: Re: Jenkins build became unstable: mahout-nightly » Mahout 
> > >> Spark bindings #1728
> > >> From: [email protected]
> > >> Date: Thu, 30 Oct 2014 19:10:19 -0700
> > >> To: [email protected]
> > >> 
> > >> I took Gokhan’s PR, merged master into it, and am compiling with 
> > >> 
> > >> mvn clean install package -Dhadoop.version=1.2.1
> > >> 
> > >> I get the same build error as the nightly.
> > >> 
> > >> Changing back to master, it builds fine. The default hadoop version is 
> > >> 1.2.1 in master, so I don’t need a profile or CLI options to build for 
> > >> my environment.
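> > >> 
> > >> (To confirm which Hadoop version a given checkout resolves to, 
> > >> something like this should work:)
> > >> 
> > >>   $ mvn help:evaluate -Dexpression=hadoop.version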
> > >> 
> > >> This seems like more than cosmic rays as Dmitriy guessed.
> > >> 
> > >> On Oct 30, 2014, at 12:41 PM, Dmitriy Lyubimov <[email protected]> wrote:
> > >> 
> > >> More likely a Spark thing.
> > >> 
> > >> The error occurs while using torrent broadcast. AFAIK that was not the 
> > >> default choice until recently.
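> > >> 
> > >> One way to test that theory would be to force the old HTTP broadcast 
> > >> back on while running the spark-module tests. A sketch only, assuming 
> > >> the module id is "spark" and that the system property actually reaches 
> > >> the forked test JVM (that depends on the surefire config):
> > >> 
> > >>   $ MAVEN_OPTS="-Dspark.broadcast.factory=org.apache.spark.broadcast.HttpBroadcastFactory" \
> > >>       mvn test -pl spark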
> > >> 
> > >> On Thu, Oct 30, 2014 at 10:27 AM, Suneel Marthi <[email protected]> 
> > >> wrote:
> > >> 
> > >>> The nightly builds often fail due to running on an old machine, and 
> > >>> the failure is also a function of the number of concurrent jobs that 
> > >>> are running. If you look at the logs from the failure, it most likely 
> > >>> would have failed due to a JVM crash (or something similar). It's the 
> > >>> daily builds that we need to ensure are not failing.
> > >>> 
> > >>> 
> > >>> On Thu, Oct 30, 2014 at 1:21 PM, Andrew Palumbo <[email protected]>
> > >>> wrote:
> > >>> 
> > >>>> I just built and tested with no problems. Probably just Jenkins 
> > >>>> acting up.
> > >>>> 
> > >>>>> Subject: Re: Jenkins build became unstable: mahout-nightly » Mahout 
> > >>>>> Spark bindings #1728
> > >>>>> From: [email protected]
> > >>>>> Date: Thu, 30 Oct 2014 09:26:45 -0700
> > >>>>> To: [email protected]
> > >>>>> 
> > >>>>> At first blush this looks unrelated to the stuff I pushed to move to 
> > >>>>> Spark 1.1.0.
> > >>>>> 
> > >>>>> The error is in snappy parsing during some R-like ops
> > >>>>> 
> > >>>>> I don’t use native snappy myself. Is anyone else seeing this, or is 
> > >>>>> it just cosmic rays?
> > >>>>> 
> > >>>>> 
> > >>>>> On Oct 29, 2014, at 4:43 PM, Apache Jenkins Server 
> > >>>>> <[email protected]> wrote:
> > >>>>> 
> > >>>>> See <https://builds.apache.org/job/mahout-nightly/org.apache.mahout$mahout-spark_2.10/1728/>
> > >>>>> 
> > >>>>> 
> > >>>>> 
> > >>>> 
> > >>> 
> > >> 