Yes, I saw that as well.  Thanks for asking for clarification, as I was
unclear about that too.  I'll just wait for you to settle it there, but
for now I've applied his patch locally so that I can get my build working.

Thanks,
Jonathan Kelly
Elastic MapReduce - SDE
Port 99 (SEA35) 08.220.C2
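[Editor's note: "applied his patch locally" above refers to the standard JIRA-attachment workflow. A minimal sketch of that workflow follows; the directory layout, file contents, and the BIGTOP-1727.patch text below are hypothetical stand-ins for the real attachment, which you would download from the ticket.]

```shell
# Sketch: applying a patch attached to a JIRA ticket against a local tree.
# Everything created here is a stand-in for illustration only.
set -e
workdir=$(mktemp -d)
cd "$workdir"

# Stand-in for the checked-out source tree.
mkdir -p repo/external/flume-sink
printf 'line one\nline two\n' > repo/external/flume-sink/pom.xml

# Stand-in for the downloaded attachment; in practice something like:
#   curl -LO https://issues.apache.org/jira/secure/attachment/<id>/BIGTOP-1727.patch
cat > BIGTOP-1727.patch <<'EOF'
--- a/external/flume-sink/pom.xml
+++ b/external/flume-sink/pom.xml
@@ -1,2 +1,2 @@
 line one
-line two
+line two patched
EOF

# Apply from the tree root; -p1 strips the leading a/ and b/ components.
cd repo
patch -p1 < ../BIGTOP-1727.patch
grep -q 'line two patched' external/flume-sink/pom.xml && echo "patch applied"
```

With a git checkout, `git apply --check ../BIGTOP-1727.patch` is a useful dry run before applying for real.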




On 3/5/15, 5:22 PM, "Konstantin Boudnik" <[email protected]> wrote:

>I have left a comment on the ticket about whether we should completely
>revert 1716 or just partially, as 1727 is suggesting. Let's settle it on
>the JIRA, if possible?
>
>Cos
>
>On Fri, Mar 06, 2015 at 01:13AM, Kelly, Jonathan wrote:
>>    Yay, this does fix the issue.  I'm new enough to Scala that it
>>    doesn't make any sense to me how this change affected the build in
>>    this way, but ok.  :)
>> 
>>    Jonathan Kelly
>> 
>>    Elastic MapReduce - SDE
>> 
>>    Port 99 (SEA35) 08.220.C2
>> 
>>    From: <Kelly>, Jonathan Kelly <[email protected]>
>>    Date: Thursday, March 5, 2015 at 5:06 PM
>>    To: "[email protected]" <[email protected]>
>>    Subject: Re: Spark v1.2.1 failing under BigTop build in External
>>Flume
>>    Sink (due to missing Netty library)
>>    Yeah, I saw all of the JIRA comments and the patch.  You guys are
>>    ridiculously quick!  I'm testing it out myself locally now.
>> 
>>    Thanks a lot,
>> 
>>    Jonathan Kelly
>> 
>>    Elastic MapReduce - SDE
>> 
>>    Port 99 (SEA35) 08.220.C2
>> 
>>    From: Youngwoo Kim <[email protected]>
>>    Reply-To: "[email protected]" <[email protected]>
>>    Date: Friday, March 6, 2015 at 3:00 AM
>>    To: "[email protected]" <[email protected]>
>>    Subject: Re: FW: Spark v1.2.1 failing under BigTop build in External
>>    Flume Sink (due to missing Netty library)
>>    Filed https://issues.apache.org/jira/browse/BIGTOP-1727 and uploaded
>>    a patch.
>>    Thanks,
>>    Youngwoo
>>    On Fri, Mar 6, 2015 at 9:38 AM, Youngwoo Kim <[email protected]> wrote:
>> 
>>      Gotcha.
>>      BIGTOP-1716 causes the build failure. I'm looking into this.
>>      Sorry for the inconvenience; I'll upload a patch.
>>      Thanks,
>>      Youngwoo
>>      On Fri, Mar 6, 2015 at 8:05 AM, jay vyas
>>      <[email protected]> wrote:
>> 
>>        Hi Jonathan!
>> 
>>        I did indeed build and test Spark 1.2.1 in BIGTOP-1648, and
>>        during the review I pasted the text output; it seemed to work
>>        nicely: https://issues.apache.org/jira/browse/BIGTOP-1648
>>        Let's follow up on this at
>>        https://issues.apache.org/jira/browse/BIGTOP-1726, where we can
>>        retest everything.  It's quite easy to retest; I'll leave some
>>        guidance there if you want to try it out.
>> 
>>        On Thu, Mar 5, 2015 at 5:04 PM, Kelly, Jonathan
>>        <[email protected]> wrote:
>> 
>>          As I said below, I don't think this could be a BigTop issue,
>>          but has anybody from the BigTop community seen anything like
>>          this?
>> 
>>          Thanks,
>>          Jonathan Kelly
>> 
>>          On 3/5/15, 1:34 PM, "Kelly, Jonathan" <[email protected]> wrote:
>> 
>>          >That's probably a good thing to have, so I'll add it, but
>>          >unfortunately it did not help this issue.  It looks like the
>>          >hadoop-2.4 profile only sets these properties, which don't
>>          >seem like they would affect anything related to Netty:
>>          >
>>          >      <properties>
>>          >        <hadoop.version>2.4.0</hadoop.version>
>>          >        <protobuf.version>2.5.0</protobuf.version>
>>          >        <jets3t.version>0.9.0</jets3t.version>
>>          >        <commons.math3.version>3.1.1</commons.math3.version>
>>          >        <avro.mapred.classifier>hadoop2</avro.mapred.classifier>
>>          >      </properties>
>>          >
>>          >
>>          >Thanks,
>>          >Jonathan Kelly
>>          >
>>          >
>>          >
>>          >
>>          >On 3/5/15, 1:09 PM, "Patrick Wendell" <[email protected]> wrote:
>>          >
>>          >>You may need to add the -Phadoop-2.4 profile.  When building
>>          >>our release packages for Hadoop 2.4 we use the following flags:
>>          >>
>>          >>-Phadoop-2.4 -Phive -Phive-thriftserver -Pyarn
>>          >>
>>          >>- Patrick
>>          >>
>>          >>On Thu, Mar 5, 2015 at 12:47 PM, Kelly, Jonathan
>>          >><[email protected]> wrote:
>>          >>> I confirmed that this has nothing to do with BigTop by
>>          >>> running the same mvn command directly in a fresh clone of
>>          >>> the Spark package at the v1.2.1 tag.  I got the same exact
>>          >>> error.
>>          >>>
>>          >>>
>>          >>>~ Jonathan Kelly
>>          >>>
>>          >>>
>>          >>> From: <Kelly>, Jonathan Kelly <[email protected]>
>>          >>> Date: Thursday, March 5, 2015 at 10:39 AM
>>          >>> To: "[email protected]" <[email protected]>
>>          >>> Subject: Spark v1.2.1 failing under BigTop build in
>>          >>> External Flume Sink (due to missing Netty library)
>>          >>>
>>          >>> I'm running into an issue building Spark v1.2.1 (as well
>>          >>> as the latest in branch-1.2, v1.3.0-rc2, and the latest in
>>          >>> branch-1.3) with BigTop (v0.9, which is not quite released
>>          >>> yet).  The build fails in the External Flume Sink
>>          >>> subproject with the following error:
>>          >>>
>>          >>> [INFO] Compiling 5 Scala sources and 3 Java sources to
>>          >>> /workspace/workspace/bigtop.spark-rpm/build/spark/rpm/BUILD/spark-1.3.0/external/flume-sink/target/scala-2.10/classes...
>>          >>> [WARNING] Class org.jboss.netty.channel.ChannelFactory not found - continuing with a stub.
>>          >>> [ERROR] error while loading NettyServer, class file
>>          >>> '/home/ec2-user/.m2/repository/org/apache/avro/avro-ipc/1.7.6/avro-ipc-1.7.6.jar(org/apache/avro/ipc/NettyServer.class)'
>>          >>> is broken (class java.lang.NullPointerException/null)
>>          >>> [WARNING] one warning found
>>          >>> [ERROR] one error found
>>          >>>
>>          >>> It seems like what is happening is that the Netty library
>>          >>> is missing at build time, which happens because it is
>>          >>> explicitly excluded in the pom.xml (see
>>          >>> https://github.com/apache/spark/blob/v1.2.1/external/flume-sink/pom.xml#L42).
>>          >>> I attempted removing the exclusions and the explicit
>>          >>> re-add for the test scope on lines 77-88, and that allowed
>>          >>> the build to succeed, though I don't know if that will
>>          >>> cause problems at runtime.  I don't have any experience
>>          >>> with the Flume Sink, so I don't really know how to test
>>          >>> it.  (And, to be clear, I'm not necessarily trying to get
>>          >>> the Flume Sink to work -- I just want the project to build
>>          >>> successfully, though of course I'd still want the Flume
>>          >>> Sink to work for whomever does need it.)
>>          >>>
>>          >>> Does anybody have any idea what's going on here?  Here is
>>          >>> the command BigTop is running to build Spark:
>>          >>>
>>          >>> mvn -Pbigtop-dist -Pyarn -Phive -Phive-thriftserver
>>          >>> -Pkinesis-asl -Divy.home=/home/ec2-user/.ivy2
>>          >>> -Dsbt.ivy.home=/home/ec2-user/.ivy2
>>          >>> -Duser.home=/home/ec2-user -Drepo.maven.org=
>>          >>> -Dreactor.repo=file:///home/ec2-user/.m2/repository
>>          >>> -Dhadoop.version=2.4.0-amzn-3-SNAPSHOT
>>          >>> -Dyarn.version=2.4.0-amzn-3-SNAPSHOT
>>          >>> -Dprotobuf.version=2.5.0 -Dscala.version=2.10.3
>>          >>> -Dscala.binary.version=2.10
>>          >>> -DskipTests -DrecompileMode=all install
>>          >>>
>>          >>> As I mentioned above, if I switch to the latest in
>>          >>> branch-1.2, to v1.3.0-rc2, or to the latest in branch-1.3,
>>          >>> I get the same exact error.  I was not getting the error
>>          >>> with Spark v1.1.0, though there weren't any changes to
>>          >>> the external/flume-sink/pom.xml between v1.1.0 and v1.2.1.
>>          >>>
>>          >>>
>>          >>> ~ Jonathan Kelly
>>          >
>> 
>>        --
>>        jay vyas
