+1 from me. All the sigs, licenses, and hashes check out. It builds and passes tests with -Phadoop-2.7 -Pyarn -Phive -Phive-thriftserver on Ubuntu 16 + Java 8.
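For anyone else who wants to verify: here's a minimal sketch of the digest check (the signature check with gpg and the build command are shown as comments). Note the filenames here are stand-ins, not the exact RC1 artifact names — substitute whatever you downloaded from the staging dir.

```shell
# Fake a local "artifact" just to show the mechanics; for the real RC,
# download the tarball and its .sha512 / .asc files from the staging dir.
tmpdir=$(mktemp -d)
cd "$tmpdir"
echo "pretend this is the release tarball" > artifact.tgz

# Publisher side: record the digest
sha512sum artifact.tgz > artifact.tgz.sha512

# Verifier side: recompute and compare; prints "artifact.tgz: OK" on success
sha512sum -c artifact.tgz.sha512

# Signature check against the release key (pwendell.asc), run on the real files:
#   gpg --import pwendell.asc
#   gpg --verify spark-2.0.2-rc1.tgz.asc spark-2.0.2-rc1.tgz
# Build/test with the same profiles I used (from the unpacked source dir):
#   ./build/mvn -Phadoop-2.7 -Pyarn -Phive -Phive-thriftserver clean package
```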
Here are the open issues for 2.0.2, BTW. No blockers, but some are marked Critical, FYI. Just making sure nobody really meant for one of these to be in 2.0.2:

SPARK-14387 Enable Hive-1.x ORC compatibility with spark.sql.hive.convertMetastoreOrc
SPARK-17822 JVMObjectTracker.objMap may leak JVM objects
SPARK-17823 Make JVMObjectTracker.objMap thread-safe
SPARK-17957 Calling outer join and na.fill(0) and then inner join will miss rows
SPARK-17972 Query planning slows down dramatically for large query plans even when sub-trees are cached
SPARK-17981 Incorrectly Set Nullability to False in FilterExec
SPARK-17982 Spark 2.0.0 CREATE VIEW statement fails :: java.lang.RuntimeException: Failed to analyze the canonicalized SQL. It is possible there is a bug in Spark.

On Thu, Oct 27, 2016 at 9:46 AM Herman van Hövell tot Westerflier <hvanhov...@databricks.com> wrote:

> +1
>
> On Thu, Oct 27, 2016 at 9:18 AM, Reynold Xin <r...@databricks.com> wrote:
>
> Greetings from Spark Summit Europe at Brussels.
>
> Please vote on releasing the following candidate as Apache Spark version
> 2.0.2. The vote is open until Sun, Oct 30, 2016 at 00:30 PDT and passes if
> a majority of at least 3 +1 PMC votes are cast.
>
> [ ] +1 Release this package as Apache Spark 2.0.2
> [ ] -1 Do not release this package because ...
>
> The tag to be voted on is v2.0.2-rc1
> (1c2908eeb8890fdc91413a3f5bad2bb3d114db6c)
>
> This release candidate resolves 75 issues:
> https://s.apache.org/spark-2.0.2-jira
>
> The release files, including signatures, digests, etc. can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-2.0.2-rc1-bin/
>
> Release artifacts are signed with the following key:
> https://people.apache.org/keys/committer/pwendell.asc
>
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapachespark-1208/
>
> The documentation corresponding to this release can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-2.0.2-rc1-docs/
>
> Q: How can I help test this release?
> A: If you are a Spark user, you can help us test this release by taking an
> existing Spark workload and running it on this release candidate, then
> reporting any regressions from 2.0.1.
>
> Q: What justifies a -1 vote for this release?
> A: This is a maintenance release in the 2.0.x series. Bugs already present
> in 2.0.1, missing features, or bugs related to new features will not
> necessarily block this release.
>
> Q: What fix version should I use for patches merging into branch-2.0 from
> now on?
> A: Please mark the fix version as 2.0.3, rather than 2.0.2. If a new RC
> (i.e. RC2) is cut, I will change the fix version of those patches to 2.0.2.
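One way to run an existing Maven-based workload against the staged artifacts is to point your pom at the staging repository from the email above. A sketch (the repo URL is the one quoted above; the spark-sql_2.11 coordinate is just an example dependency — use whichever Spark modules your workload needs):

```xml
<!-- Add the RC staging repo so Maven can resolve the 2.0.2 artifacts -->
<repositories>
  <repository>
    <id>spark-2.0.2-rc1-staging</id>
    <url>https://repository.apache.org/content/repositories/orgapachespark-1208/</url>
  </repository>
</repositories>

<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.0.2</version>
  </dependency>
</dependencies>
```

Then rebuild and rerun your workload, and report any regressions from 2.0.1 here.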