+1 (non-binding) Vlad
On May 19, 2025, at 8:56 PM, Jules Damji <jules.da...@gmail.com> wrote:

+1 (non-binding)

— Sent from my iPhone
Pardon the dumb thumb typos :)

On May 19, 2025, at 5:26 PM, Gengliang Wang <ltn...@gmail.com> wrote:

+1

On Mon, May 19, 2025 at 5:21 PM Jungtaek Lim <kabhwan.opensou...@gmail.com> wrote:

+1 (non-binding)

On Tue, May 20, 2025 at 8:47 AM Ruifeng Zheng <ruife...@apache.org> wrote:

+1

On Tue, May 20, 2025 at 7:04 AM Hyukjin Kwon <gurwls...@apache.org> wrote:

+1

On Mon, 19 May 2025 at 21:27, Wenchen Fan <cloud0...@gmail.com> wrote:

Same as before, I'll start with my own +1.

On Mon, May 19, 2025 at 8:25 PM Wenchen Fan <cloud0...@gmail.com> wrote:

Please vote on releasing the following candidate as Apache Spark version 4.0.0. The vote is open until May 22 (PST) and passes if a majority of +1 PMC votes are cast, with a minimum of 3 +1 votes.

[ ] +1 Release this package as Apache Spark 4.0.0
[ ] -1 Do not release this package because ...

To learn more about Apache Spark, please see https://spark.apache.org/

The tag to be voted on is v4.0.0-rc7 (commit fa33ea000a0bda9e5a3fa1af98e8e85b8cc5e4d4):
https://github.com/apache/spark/tree/v4.0.0-rc7

The release files, including signatures, digests, etc. can be found at:
https://dist.apache.org/repos/dist/dev/spark/v4.0.0-rc7-bin/

Signatures used for Spark RCs can be found in this file:
https://dist.apache.org/repos/dist/dev/spark/KEYS

The staging repository for this release can be found at:
https://repository.apache.org/content/repositories/orgapachespark-1485/

The documentation corresponding to this release can be found at:
https://dist.apache.org/repos/dist/dev/spark/v4.0.0-rc7-docs/

The list of bug fixes going into 4.0.0 can be found at the following URL:
https://issues.apache.org/jira/projects/SPARK/versions/12353359

This release uses the release script of the tag v4.0.0-rc7.

FAQ

=========================
How can I help test this release?
=========================

If you are a Spark user, you can help us test this release by taking an existing Spark workload, running it on this release candidate, and reporting any regressions.

If you're working in PySpark, you can set up a virtual env, install the current RC, and see if anything important breaks. For Java/Scala, you can add the staging repository to your project's resolvers and test with the RC (make sure to clean up the artifact cache before/after so you don't end up building with an out-of-date RC going forward).
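For the Java/Scala testing path, pointing an sbt build at the staging repository might look like the sketch below. This is an illustration, not part of the release announcement: the resolver label is arbitrary, and `spark-sql` stands in for whichever Spark modules your project actually depends on.

```scala
// build.sbt -- resolve the 4.0.0 RC artifacts from the staging repository
// listed above, since they are not yet published to Maven Central.
resolvers += "Apache Spark 4.0.0 RC7 staging" at
  "https://repository.apache.org/content/repositories/orgapachespark-1485/"

// Swap in the modules your workload uses (spark-core, spark-mllib, ...).
libraryDependencies += "org.apache.spark" %% "spark-sql" % "4.0.0"
```

After testing, remember to drop the staging resolver and clear the local artifact cache (e.g. the `org/apache/spark` entries under `~/.ivy2` or `~/.m2`) so later builds don't pick up stale RC artifacts.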