+1. Great, I see the list of resolved issues; do you also have a list of known issues that will remain in this release?

Built successfully with:
build/mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.7.1 -Phive -Phive-thriftserver -DskipTests clean package

mvn -version
Apache Maven 3.3.9 (bb52d8502b132ec0a5a3f4c09453c07478323dc5; 
2015-11-10T17:41:47+01:00)
Maven home: /Users/omarcu/tools/apache-maven-3.3.9
Java version: 1.7.0_80, vendor: Oracle Corporation
Java home: /Library/Java/JavaVirtualMachines/jdk1.7.0_80.jdk/Contents/Home/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "mac os x", version: "10.11.5", arch: "x86_64", family: "mac"

[INFO] Reactor Summary:
[INFO] 
[INFO] Spark Project Parent POM ........................... SUCCESS [  2.635 s]
[INFO] Spark Project Tags ................................. SUCCESS [  1.896 s]
[INFO] Spark Project Sketch ............................... SUCCESS [  2.560 s]
[INFO] Spark Project Networking ........................... SUCCESS [  6.533 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [  4.176 s]
[INFO] Spark Project Unsafe ............................... SUCCESS [  4.809 s]
[INFO] Spark Project Launcher ............................. SUCCESS [  6.242 s]
[INFO] Spark Project Core ................................. SUCCESS [01:20 min]
[INFO] Spark Project GraphX ............................... SUCCESS [  9.148 s]
[INFO] Spark Project Streaming ............................ SUCCESS [ 22.760 s]
[INFO] Spark Project Catalyst ............................. SUCCESS [ 50.783 s]
[INFO] Spark Project SQL .................................. SUCCESS [01:05 min]
[INFO] Spark Project ML Local Library ..................... SUCCESS [  4.281 s]
[INFO] Spark Project ML Library ........................... SUCCESS [ 54.537 s]
[INFO] Spark Project Tools ................................ SUCCESS [  0.747 s]
[INFO] Spark Project Hive ................................. SUCCESS [ 33.032 s]
[INFO] Spark Project HiveContext Compatibility ............ SUCCESS [  3.198 s]
[INFO] Spark Project REPL ................................. SUCCESS [  3.573 s]
[INFO] Spark Project YARN Shuffle Service ................. SUCCESS [  4.617 s]
[INFO] Spark Project YARN ................................. SUCCESS [  7.321 s]
[INFO] Spark Project Hive Thrift Server ................... SUCCESS [ 16.496 s]
[INFO] Spark Project Assembly ............................. SUCCESS [  2.300 s]
[INFO] Spark Project External Flume Sink .................. SUCCESS [  4.219 s]
[INFO] Spark Project External Flume ....................... SUCCESS [  6.987 s]
[INFO] Spark Project External Flume Assembly .............. SUCCESS [  1.465 s]
[INFO] Spark Integration for Kafka 0.8 .................... SUCCESS [  6.891 s]
[INFO] Spark Project Examples ............................. SUCCESS [ 13.465 s]
[INFO] Spark Project External Kafka Assembly .............. SUCCESS [  2.815 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 07:04 min
[INFO] Finished at: 2016-05-18T17:55:33+02:00
[INFO] Final Memory: 90M/824M
[INFO] ------------------------------------------------------------------------
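
For reference, a minimal smoke test of a locally built tree like the one above (assuming the package step finished and you are still in the source root) could be:

# run the bundled SparkPi example against the freshly built bits
./bin/run-example SparkPi 10

# or start a local shell and confirm the banner reports 2.0.0-preview
./bin/spark-shell --master local[2]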

> On 18 May 2016, at 16:28, Sean Owen <so...@cloudera.com> wrote:
> 
> I think it's a good idea. Although releases have been preceded before
> by release candidates for developers, it would be good to get a formal
> preview/beta release ratified for public consumption ahead of a new
> major release. Better to have a little more testing in the wild to
> identify problems before 2.0.0 is finalized.
> 
> +1 to the release. License, sigs, etc check out. On Ubuntu 16 + Java
> 8, compilation and tests succeed for "-Pyarn -Phive
> -Phive-thriftserver -Phadoop-2.6".
> 
> On Wed, May 18, 2016 at 6:40 AM, Reynold Xin <r...@apache.org> wrote:
>> Hi,
>> 
>> In the past, the Apache Spark community has created preview packages (not
>> official releases) and used those as opportunities to ask community members
>> to test the upcoming versions of Apache Spark. Several people in the Apache
>> community have suggested we conduct votes for these preview packages and
>> turn them into formal releases by the Apache Foundation's standards. Preview
>> releases are not meant to be fully functional, i.e. they can and very likely
>> will contain critical bugs or documentation errors, but we will be able to
>> post them to the project's website to get wider feedback. They should
>> satisfy the legal requirements of Apache's release policy
>> (http://www.apache.org/dev/release.html), such as having proper licenses.
>> 
>> 
>> Please vote on releasing the following candidate as Apache Spark version
>> 2.0.0-preview. The vote is open until Friday, May 20, 2016 at 11:00 PM PDT
>> and passes if a majority of at least 3 +1 PMC votes are cast.
>> 
>> [ ] +1 Release this package as Apache Spark 2.0.0-preview
>> [ ] -1 Do not release this package because ...
>> 
>> To learn more about Apache Spark, please see http://spark.apache.org/
>> 
>> The tag to be voted on is 2.0.0-preview
>> (8f5a04b6299e3a47aca13cbb40e72344c0114860)
>> 
>> The release files, including signatures, digests, etc. can be found at:
>> http://home.apache.org/~pwendell/spark-releases/spark-2.0.0-preview-bin/
>> 
>> Release artifacts are signed with the following key:
>> https://people.apache.org/keys/committer/pwendell.asc
>> 
>> The documentation corresponding to this release can be found at:
>> http://home.apache.org/~pwendell/spark-releases/spark-2.0.0-preview-docs/
>> 
>> The list of resolved issues is:
>> https://issues.apache.org/jira/browse/SPARK-15351?jql=project%20%3D%20SPARK%20AND%20fixVersion%20%3D%202.0.0
>> 
>> 
>> If you are a Spark user, you can help us test this release by taking an
>> existing Apache Spark workload, running it on this candidate, and reporting
>> any regressions.
>> 
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
> For additional commands, e-mail: dev-h...@spark.apache.org
> 
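
On the request above to test existing workloads against this candidate, a rough sketch with the preview binaries could look like the following; the application class and jar are hypothetical placeholders, and the exact archive name depends on which -bin package you download from the staging URL:

# unpack one of the -bin packages from the staging URL above
# (exact archive name depends on the Hadoop profile chosen)
tar xzf spark-2.0.0-preview-bin-*.tgz
cd spark-2.0.0-preview-bin-*/

# com.example.MyApp and my-app.jar stand in for an existing workload
./bin/spark-submit --class com.example.MyApp --master local[4] /path/to/my-app.jar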
