Passed all the tests, looks good.
On 2/23/18 15:00, Holden Karau wrote:
PySpark artifacts install in a fresh Py3 virtual env
On Feb 23, 2018 7:55 AM, "Denny Lee" <denny.g....@gmail.com> wrote:
On Fri, Feb 23, 2018 at 07:08 Josh Goldsborough wrote:
I'm new to testing out Spark RCs for the community, but I was able
to run some of the basic unit tests without error, so for what
it's worth, I'm a +1.
On Thu, Feb 22, 2018 at 4:23 PM, Sameer Agarwal
<samee...@apache.org> wrote:
Please vote on releasing the following candidate as Apache
Spark version 2.3.0. The vote is open until Tuesday,
February 27, 2018 at 8:00:00 am UTC and passes if a
majority of at least 3 PMC +1 votes are cast.
[ ] +1 Release this package as Apache Spark 2.3.0
[ ] -1 Do not release this package because ...
To learn more about Apache Spark, please see
The tag to be voted on is v2.3.0-rc5:
The list of JIRA tickets resolved in this release can be found
The release files, including signatures, digests, etc. can
be found at:
Release artifacts are signed with the following key:
The staging repository for this release can be found at:
The documentation corresponding to this release can be found at:
What are the unresolved issues targeted for 2.3.0?
Please see https://s.apache.org/oXKi. At the time of
writing, there are no known release blockers.
How can I help test this release?
If you are a Spark user, you can help us test this release
by taking an existing Spark workload, running it on this
release candidate, and reporting any regressions.
If you're working in PySpark, you can set up a virtual env,
install the current RC, and see if anything important
breaks. In Java/Scala, you can add the staging repository
to your project's resolvers and test with the RC
(make sure to clean up the artifact cache before/after so
you don't end up building with an out-of-date RC).
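For the PySpark path, a minimal sketch of the fresh-virtual-env check
described above (the RC tarball path in the comments is a placeholder,
not a real artifact location; substitute the file from the staging
location in this thread):

```shell
# Create a throwaway Python 3 virtual env and confirm the RC installs
# into it cleanly. VENV_DIR and the tarball path are assumptions.
set -e
VENV_DIR="$(mktemp -d)/spark-rc-venv"
python3 -m venv "$VENV_DIR"
# Install the RC artifact into the isolated env (hypothetical path):
# "$VENV_DIR/bin/pip" install /path/to/pyspark-2.3.0.tar.gz
# Smoke-test that the package imports:
# "$VENV_DIR/bin/python" -c "import pyspark; print(pyspark.__version__)"
echo "created $VENV_DIR"
```

On the Java/Scala side, the analogous step is adding the staging
repository to your build's resolvers and clearing the local artifact
cache (e.g. ~/.ivy2/cache or ~/.m2/repository) before and after, as
noted above.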
What should happen to JIRA tickets still targeting 2.3.0?
Committers should look at those and triage. Extremely
important bug fixes, documentation, and API tweaks that
impact compatibility should be worked on immediately.
Everything else please retarget to 2.3.1 or 2.4.0 as appropriate.
Why is my bug not fixed?
In order to make timely releases, we will typically not
hold the release unless the bug in question is a
regression from 2.2.0. That being said, if there is
something that is a regression from 2.2.0 and has not
been correctly targeted, please ping me or a committer to
help target the issue (you can see the open issues listed
as impacting Spark 2.3.0 at https://s.apache.org/WmoI).