-1 (non-binding) as we have a couple of live blockers.

I know RC1 is mostly meant to let everyone audit the release artifacts
early, before everything is ready, but I just wanted to make clear that we
expect to have another RC.

On Wed, Feb 19, 2025 at 7:26 PM Mich Talebzadeh <mich.talebza...@gmail.com>
wrote:

> +1
>
> Dr Mich Talebzadeh,
> Architect | Data Science | Financial Crime | Forensic Analysis | GDPR
>
>    view my LinkedIn profile
> <https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>
>
> On Wed, 19 Feb 2025 at 09:31, Wenchen Fan <cloud0...@gmail.com> wrote:
>
>> Please vote on releasing the following candidate as Apache Spark version
>> 4.0.0.
>>
>> The vote is open until February 21 (PST) and passes if a majority of +1
>> PMC votes are cast, with a minimum of 3 +1 votes.
>>
>> [ ] +1 Release this package as Apache Spark 4.0.0
>> [ ] -1 Do not release this package because ...
>>
>> To learn more about Apache Spark, please see https://spark.apache.org/
>>
>> The tag to be voted on is v4.0.0-rc1 (commit
>> 860bd5e93d659852c0e80a82d4494d0e9548fde5)
>> https://github.com/apache/spark/tree/v4.0.0-rc1
>>
>> The release files, including signatures, digests, etc. can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v4.0.0-rc1-bin/
>>
>> Signatures used for Spark RCs can be found in this file:
>> https://dist.apache.org/repos/dist/dev/spark/KEYS
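>>
>> If you want to script the integrity check, here is a minimal Python
>> sketch for verifying a downloaded artifact against its published SHA-512
>> digest (the filename below is illustrative; signature verification
>> itself should still be done with GPG against the KEYS file above):
>>
>>   import hashlib
>>
>>   artifact = "spark-4.0.0-bin-hadoop3.tgz"  # illustrative filename
>>
>>   # Hash the artifact in chunks to keep memory use low.
>>   h = hashlib.sha512()
>>   with open(artifact, "rb") as f:
>>       for chunk in iter(lambda: f.read(1 << 20), b""):
>>           h.update(chunk)
>>
>>   # The published .sha512 file may contain the filename and wrapped,
>>   # uppercase hex; normalize before comparing.
>>   expected = "".join(open(artifact + ".sha512").read().lower().split())
>>   assert h.hexdigest() in expected, "SHA-512 digest mismatch"
>>   print("SHA-512 digest matches")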
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1475/
>>
>> The documentation corresponding to this release can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v4.0.0-rc1-docs/
>>
>> The list of bug fixes going into 4.0.0 can be found at the following URL:
>> https://issues.apache.org/jira/projects/SPARK/versions/12353359
>>
>> This release uses the release script from the tag v4.0.0-rc1.
>>
>> FAQ
>>
>> =========================
>> How can I help test this release?
>> =========================
>>
>> If you are a Spark user, you can help us test this release by taking
>> an existing Spark workload, running it on this release candidate, and
>> reporting any regressions.
>>
>> If you're working in PySpark, you can set up a virtual env, install the
>> current RC, and see if anything important breaks; if you're working in
>> Java/Scala, you can add the staging repository to your project's
>> resolvers and test with the RC (make sure to clean up the artifact cache
>> before/after so you don't end up building with an out-of-date RC going
>> forward).
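>>
>> As a concrete starting point, here is a minimal PySpark smoke test
>> (assuming the RC's pyspark package has already been pip-installed into
>> the virtual env, e.g. from the tarball under the -bin/ directory above;
>> the query is just a placeholder for your real workload):
>>
>>   from pyspark.sql import SparkSession
>>
>>   # Spin up a small local session against the RC bits.
>>   spark = (SparkSession.builder
>>            .master("local[2]")
>>            .appName("spark-4.0.0-rc1-smoke")
>>            .getOrCreate())
>>
>>   # A trivial end-to-end query; swap in your existing workload here.
>>   df = spark.range(100).selectExpr("id", "id * 2 AS doubled")
>>   assert df.count() == 100
>>
>>   print(spark.version)  # should report 4.0.0
>>   spark.stop()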
>>
>
