>> And I'm seeing the errors when I build Spark the first time after
>> downloading and extracting
>> spark-0.9.0-incubating.tgz<http://people.apache.org/~pwendell/spark-0.9.0-incubating-rc5/spark-0.9.0-incubating.tgz>

A little clarification: I see the errors during 'sbt test' after building
Spark, not during the build itself.
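A quick way to check for the mixed-assemblies condition suggested further down the thread is to count the assembly jars in the source tree. This is a minimal sketch, not from the thread itself; the `spark-assembly-*.jar` name pattern is an assumption based on Spark's standard assembly naming:

```shell
#!/bin/sh
# Count Spark assembly jars under the current directory. More than one usually
# means assemblies built against different Hadoop versions are mixed together,
# which can break the local-cluster test suites.
count=$(find . -name 'spark-assembly-*.jar' 2>/dev/null | grep -c .)
echo "assembly jars found: $count"
if [ "$count" -gt 1 ]; then
  echo "multiple assemblies present; remove them and rebuild before 'sbt/sbt test'"
fi
```

Run it from the top of the source checkout; anything above one jar is worth cleaning up before re-running the tests.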


On Sun, Jan 26, 2014 at 11:48 PM, Taka Shinagawa <taka.epsi...@gmail.com> wrote:

> I'm always running sbt clean before building Spark. And I'm seeing the
> errors when I build Spark the first time after downloading and extracting
> spark-0.9.0-incubating.tgz<http://people.apache.org/~pwendell/spark-0.9.0-incubating-rc5/spark-0.9.0-incubating.tgz>
>
> Just in case, I deleted the test.jar files (in the work/app-201401XXXXXX-0000
> directories) that didn't get deleted by sbt clean, but I still see the
> errors.
>
> This is not a blocker for the release at all. I've tested RC5 and haven't
> found any other issues so far.
>
> I'm curious if anyone else sees this.
>
>
>
>
> On Sun, Jan 26, 2014 at 10:58 PM, Patrick Wendell <pwend...@gmail.com> wrote:
>
>> Hey Taka,
>>
>> If you build a second version, you need to clean the existing assembly jar.
>>
>> The reference runs of the tests are the ones on the U.C. Berkeley
>> Jenkins. These are passing for branch 0.9 for both the Hadoop 1 and
>> Hadoop 2 versions, so I'm inclined to think it's an issue with your
>> test environment or setup.
>>
>> https://amplab.cs.berkeley.edu/jenkins/view/Spark/
>>
>> - Patrick
>>
>> On Sun, Jan 26, 2014 at 10:52 PM, Reynold Xin <r...@databricks.com> wrote:
>> > It is possible that you have generated the assembly jar using one version
>> > of Hadoop, and then another assembly jar with another version. Those tests
>> > that failed are all using a local cluster that sets up multiple processes,
>> > which would require launching Spark worker processes using the assembly
>> > jar. If that's indeed the problem, removing the extra assembly jars should
>> > fix them.
>> >
>> >
>> > On Sun, Jan 26, 2014 at 10:49 PM, Taka Shinagawa <taka.epsi...@gmail.com> wrote:
>> >
>> >> If I build Spark for Hadoop 1.0.4 (either "SPARK_HADOOP_VERSION=1.0.4
>> >> sbt/sbt assembly" or "sbt/sbt assembly") or use the binary distribution,
>> >> 'sbt/sbt test' runs successfully.
>> >>
>> >> However, if I build Spark targeting any other Hadoop version (e.g.
>> >> "SPARK_HADOOP_VERSION=1.2.1 sbt/sbt assembly" or "SPARK_HADOOP_VERSION=2.2.0
>> >> sbt/sbt assembly"), I get the following errors with 'sbt/sbt test':
>> >>
>> >> 1) type mismatch errors in JavaPairDStream.scala
>> >> 2) the following test failures:
>> >> [error] Failed tests:
>> >> [error] org.apache.spark.ShuffleNettySuite
>> >> [error] org.apache.spark.ShuffleSuite
>> >> [error] org.apache.spark.FileServerSuite
>> >> [error] org.apache.spark.DistributedSuite
>> >>
>> >> I don't have Hadoop 1.0.4 installed on my test systems, yet the tests
>> >> succeed for that version and fail for the Hadoop versions that are
>> >> installed. I'm seeing these sbt test errors with the previous 0.9.0 RCs
>> >> and with 0.8.1, too.
>> >>
>> >> I'm wondering if anyone else has seen this problem, or whether I'm
>> >> missing something needed to run the tests correctly.
>> >>
>> >> Thanks,
>> >> Taka
>> >>
>> >>
>> >>
>> >>
>> >> On Sat, Jan 25, 2014 at 5:00 PM, Sean McNamara <sean.mcnam...@webtrends.com> wrote:
>> >>
>> >> > +1
>> >> >
>> >> > On 1/25/14, 4:04 PM, "Mark Hamstra" <m...@clearstorydata.com> wrote:
>> >> >
>> >> > >+1
>> >> > >
>> >> > >
>> >> > >On Sat, Jan 25, 2014 at 2:37 PM, Andy Konwinski <andykonwin...@gmail.com> wrote:
>> >> > >
>> >> > >> +1
>> >> > >>
>> >> > >>
>> >> > >> On Sat, Jan 25, 2014 at 2:27 PM, Reynold Xin <r...@databricks.com> wrote:
>> >> > >>
>> >> > >> > +1
>> >> > >> >
>> >> > >> > > On Jan 25, 2014, at 12:07 PM, Hossein <fal...@gmail.com> wrote:
>> >> > >> > >
>> >> > >> > > +1
>> >> > >> > >
>> >> > >> > > Compiled and tested on Mavericks.
>> >> > >> > >
>> >> > >> > > --Hossein
>> >> > >> > >
>> >> > >> > >
>> >> > >> > > On Sat, Jan 25, 2014 at 11:38 AM, Patrick Wendell <pwend...@gmail.com> wrote:
>> >> > >> > >
>> >> > >> > >> I'll kick off the voting with a +1.
>> >> > >> > >>
>> >> > >> > >> On Thu, Jan 23, 2014 at 11:33 PM, Patrick Wendell <pwend...@gmail.com> wrote:
>> >> > >> > >>> Please vote on releasing the following candidate as Apache Spark
>> >> > >> > >>> (incubating) version 0.9.0.
>> >> > >> > >>>
>> >> > >> > >>> A draft of the release notes along with the changes file is
>> >> > >> > >>> attached to this e-mail.
>> >> > >> > >>>
>> >> > >> > >>> The tag to be voted on is v0.9.0-incubating (commit 95d28ff3):
>> >> > >> > >>> https://git-wip-us.apache.org/repos/asf?p=incubator-spark.git;a=commit;h=95d28ff3d0d20d9c583e184f9e2c5ae842d8a4d9
>> >> > >> > >>>
>> >> > >> > >>> The release files, including signatures, digests, etc. can be
>> >> > >> > >>> found at:
>> >> > >> > >>> http://people.apache.org/~pwendell/spark-0.9.0-incubating-rc5
>> >> > >> > >>>
>> >> > >> > >>> Release artifacts are signed with the following key:
>> >> > >> > >>> https://people.apache.org/keys/committer/pwendell.asc
>> >> > >> > >>>
>> >> > >> > >>> The staging repository for this release can be found at:
>> >> > >> > >>> https://repository.apache.org/content/repositories/orgapachespark-1006/
>> >> > >> > >>>
>> >> > >> > >>> The documentation corresponding to this release can be found at:
>> >> > >> > >>> http://people.apache.org/~pwendell/spark-0.9.0-incubating-rc5-docs/
>> >> > >> > >>>
>> >> > >> > >>> Please vote on releasing this package as Apache Spark
>> >> > >> > >>> 0.9.0-incubating!
>> >> > >> > >>>
>> >> > >> > >>> The vote is open until Monday, January 27, at 07:30 UTC and
>> >> > >> > >>> passes if a majority of at least 3 +1 PPMC votes are cast.
>> >> > >> > >>>
>> >> > >> > >>> [ ] +1 Release this package as Apache Spark 0.9.0-incubating
>> >> > >> > >>> [ ] -1 Do not release this package because ...
>> >> > >> > >>>
>> >> > >> > >>> To learn more about Apache Spark, please see
>> >> > >> > >>> http://spark.incubator.apache.org/
>> >> > >> > >>
>> >> > >> >
