Re: [VOTE] Spark 2.2.2 (RC2)

2018-06-27 Thread zhenya Sun
+1
> On Jun 28, 2018, at 10:15 AM, Hyukjin Kwon wrote:
> 
> +1
> 
> On Thu, Jun 28, 2018 at 8:42 AM, Sean Owen wrote:
> +1 from me too.
> 
> On Wed, Jun 27, 2018 at 3:31 PM Tom Graves wrote:
> Please vote on releasing the following candidate as Apache Spark version 
> 2.2.2.
> 
> The vote is open until Mon, July 2nd @ 9PM UTC (2PM PDT) and passes if a
> majority of +1 PMC votes are cast, with a minimum of 3 +1 votes.
> 
> [ ] +1 Release this package as Apache Spark 2.2.2
> [ ] -1 Do not release this package because ...
> 
> To learn more about Apache Spark, please see http://spark.apache.org/ 
> 
> 
> The tag to be voted on is v2.2.2-rc2 (commit 
> fc28ba3db7185e84b6dbd02ad8ef8f1d06b9e3c6):
> https://github.com/apache/spark/tree/v2.2.2-rc2 
> 
> 
> The release files, including signatures, digests, etc. can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v2.2.2-rc2-bin/ 
> 
> 
> Signatures used for Spark RCs can be found in this file:
> https://dist.apache.org/repos/dist/dev/spark/KEYS 
> 
> 
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapachespark-1276/ 
> 
> 
> The documentation corresponding to this release can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v2.2.2-rc2-docs/ 
> 
> 
> The list of bug fixes going into 2.2.2 can be found at the following URL:
> https://issues.apache.org/jira/projects/SPARK/versions/12342171 
> 
> 
> 
> Notes:
> 
> - RC1 was not sent for a vote. I had trouble building it, and by the time I 
> got
>   things fixed, there was a blocker bug filed. It was already tagged in git
>   at that time.
> 
> 
> FAQ
> 
> =
> How can I help test this release?
> =
> 
> If you are a Spark user, you can help us test this release by taking
> an existing Spark workload and running it on this release candidate, then
> reporting any regressions.
> 
> If you're working in PySpark, you can set up a virtual env, install
> the current RC, and see if anything important breaks. In Java/Scala,
> you can add the staging repository to your project's resolvers and test
> with the RC (make sure to clean up the artifact cache before/after so
> you don't end up building with an out-of-date RC going forward).
> 
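> For Java/Scala, a minimal build.sbt sketch (assuming sbt and Scala 2.11;
> the staging repository URL is the one listed above):
> 
>   // resolve artifacts from the 2.2.2 RC2 staging repository
>   resolvers += "Apache Spark 2.2.2 RC2 staging" at
>     "https://repository.apache.org/content/repositories/orgapachespark-1276/"
>   // build and test your project against the RC version
>   libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.2"
> 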
> ===
> What should happen to JIRA tickets still targeting 2.2.2?
> ===
> 
> The current list of open tickets targeted at 2.2.2 can be found at:
> https://issues.apache.org/jira/projects/SPARK
> (search for "Target Version/s" = 2.2.2)
> 
> 
> Committers should look at those and triage. Extremely important bug
> fixes, documentation, and API tweaks that impact compatibility should
> be worked on immediately. Everything else should be retargeted to an
> appropriate release.
> 
> ==
> But my bug isn't fixed?
> ==
> 
> In order to make timely releases, we will typically not hold the
> release unless the bug in question is a regression from the previous
> release. That being said, if there is a regression that has not been
> correctly targeted, please ping me or a committer to help target the
> issue.
> 
> 
> -- 
> Tom Graves



Re: Welcome Zhenhua Wang as a Spark committer

2018-04-01 Thread zhenya Sun
Congratulations!

> On Apr 2, 2018, at 1:30 PM, Hyukjin Kwon wrote:
> 
> Congratulations, Zhenhua Wang! Very well deserved.
> 
> 2018-04-02 13:28 GMT+08:00 Wenchen Fan:
> Hi all,
> 
> The Spark PMC recently added Zhenhua Wang as a committer on the project. 
> Zhenhua is the major contributor to the CBO project and has been
> contributing across several areas of Spark for a while, focusing especially
> on the analyzer and optimizer in Spark SQL. Please join me in welcoming Zhenhua!
> 
> Wenchen
>