Compilation on the master branch has been fixed.

Thanks to Cheng Lian.
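
For anyone who wants to double-check locally, re-running the Maven build that
was failing should now succeed. A minimal sketch of such a check (profile
flags copied from the failing Jenkins command quoted below):

  build/mvn -DskipTests -Phadoop-2.4 -Pyarn -Phive -Phive-thriftserver -Pkinesis-asl clean package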

On Thu, Jul 9, 2015 at 8:50 AM, Josh Rosen <rosenvi...@gmail.com> wrote:

> Jenkins runs compile-only builds for Maven as an early warning system for
> this type of issue; you can see from
> https://amplab.cs.berkeley.edu/jenkins/view/Spark-QA-Compile/ that the
> Maven compilation is now broken in master.
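>
> A compile-only pass like the one those builds run can be approximated
> locally with something along these lines (a sketch; the flags here are
> illustrative, not the exact Jenkins configuration):
>
>   build/mvn -DskipTests -Phadoop-2.4 -Pyarn -Phive -Phive-thriftserver compile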
>
> On Thu, Jul 9, 2015 at 8:48 AM, Ted Yu <yuzhih...@gmail.com> wrote:
>
>> I guess the compilation issue didn't surface in the QA run because sbt was used:
>>
>> [info] Building Spark (w/Hive 0.13.1) using SBT with these arguments: -Pyarn -Phadoop-2.3 -Dhadoop.version=2.3.0 -Pkinesis-asl -Phive-thriftserver -Phive package assembly/assembly streaming-kafka-assembly/assembly streaming-flume-assembly/assembly
>>
>>
>> Cheers
>>
>>
>> On Thu, Jul 9, 2015 at 7:58 AM, Ted Yu <yuzhih...@gmail.com> wrote:
>>
>>> From
>>> https://amplab.cs.berkeley.edu/jenkins/job/Spark-Master-Maven-with-YARN/HADOOP_PROFILE=hadoop-2.4,label=centos/2875/consoleFull
>>> :
>>>
>>> + build/mvn -DzincPort=3439 -DskipTests -Phadoop-2.4 -Pyarn -Phive -Phive-thriftserver -Pkinesis-asl clean package
>>>
>>>
>>> FYI
>>>
>>>
>>> On Thu, Jul 9, 2015 at 7:51 AM, Sean Owen <so...@cloudera.com> wrote:
>>>
>>>> This is an error from scalac and not Spark. I find it happens
>>>> frequently for me but goes away on a clean build. *shrug*
>>>>
>>>>
>>>> On Thu, Jul 9, 2015 at 3:45 PM, Ted Yu <yuzhih...@gmail.com> wrote:
>>>> > Looking at
>>>> > https://amplab.cs.berkeley.edu/jenkins/job/Spark-Master-Maven-with-YARN/HADOOP_PROFILE=hadoop-2.4,label=centos/2875/consoleFull
>>>> > :
>>>> >
>>>> > [error]
>>>> > [error]      while compiling: /home/jenkins/workspace/Spark-Master-Maven-with-YARN/HADOOP_PROFILE/hadoop-2.4/label/centos/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveMetastoreCatalog.scala
>>>> > [error]         during phase: typer
>>>> > [error]      library version: version 2.10.4
>>>> > [error]     compiler version: version 2.10.4
>>>> >
>>>> >
>>>> > I traced back to build #2869 and the error was already there; I didn't
>>>> > go back further.
>>>> >
>>>> >
>>>> > FYI
>>>> >
>>>> >
>>>> > On Thu, Jul 9, 2015 at 7:24 AM, Yijie Shen <henry.yijies...@gmail.com>
>>>> > wrote:
>>>> >>
>>>> >> Hi,
>>>> >>
>>>> >> I'm using a clean clone of the master branch, built with:
>>>> >>
>>>> >> build/mvn -Phive -Phadoop-2.4 -DskipTests package
>>>> >>
>>>> >> It ends in BUILD FAILURE, due to:
>>>> >>
>>>> >> [error]      while compiling: /Users/yijie/spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveMetastoreCatalog.scala
>>>> >> [error]         during phase: typer
>>>> >> [error]      library version: version 2.10.4
>>>> >> [error]     compiler version: version 2.10.4
>>>> >> ...
>>>> >> [error]
>>>> >> [error]   last tree to typer: Ident(Warehouse)
>>>> >> [error]               symbol: <none> (flags: )
>>>> >> [error]    symbol definition: <none>
>>>> >> [error]        symbol owners:
>>>> >> [error]       context owners: lazy value hiveWarehouse -> class HiveMetastoreCatalog -> package hive
>>>> >> [error]
>>>> >> [error] == Enclosing template or block ==
>>>> >> [error]
>>>> >> [error] Template( // val <local HiveMetastoreCatalog>: <notype> in class HiveMetastoreCatalog
>>>> >> [error]   "Catalog", "Logging" // parents
>>>> >> [error]   ValDef(
>>>> >> [error]     private
>>>> >> [error]     "_"
>>>> >> [error]     <tpt>
>>>> >> [error]     <empty>
>>>> >> [error]   )
>>>> >> [error]   // 24 statements
>>>> >> [error]   ValDef( // private[this] val client: org.apache.spark.sql.hive.client.ClientInterface in class HiveMetastoreCatalog
>>>> >> [error]     private <local> <paramaccessor>
>>>> >> [error]     "client"
>>>> >> [error]     "ClientInterface"
>>>> >> [error]     <empty>
>>>> >> …
>>>> >>
>>>> >>
>>>> >> https://gist.github.com/yijieshen/e0925e2227a312ae4c64#file-build_failure
>>>> >>
>>>> >> Did I make a silly mistake?
>>>> >>
>>>> >> Thanks, Yijie
>>>> >
>>>> >
>>>>
>>>
>>>
>>
>
