+1

On Tue, Feb 3, 2026 at 20:18 Peter Toth <[email protected]> wrote:
> +1 (non-binding)
>
> On Mon, Feb 2, 2026 at 1:14 PM Dongjoon Hyun <[email protected]> wrote:
>
>> +1
>>
>> Dongjoon
>>
>> PS. Thank you, Wenchen.
>>
>> On Mon, Feb 2, 2026 at 02:52 Wenchen Fan <[email protected]> wrote:
>>
>>> +1.
>>>
>>> The Spark R doc issue is fixed.
>>>
>>> On Mon, Feb 2, 2026 at 6:37 PM <[email protected]> wrote:
>>>
>>>> Please vote on releasing the following candidate as Apache Spark
>>>> version 4.0.2.
>>>>
>>>> The vote is open until Thu, 05 Feb 2026 03:36:18 PST and passes if a
>>>> majority of +1 PMC votes are cast, with a minimum of 3 +1 votes.
>>>>
>>>> [ ] +1 Release this package as Apache Spark 4.0.2
>>>> [ ] -1 Do not release this package because ...
>>>>
>>>> To learn more about Apache Spark, please see https://spark.apache.org/
>>>>
>>>> The tag to be voted on is v4.0.2-rc1 (commit 7cc3b9bcdaa):
>>>> https://github.com/apache/spark/tree/v4.0.2-rc1
>>>>
>>>> The release files, including signatures, digests, etc. can be found at:
>>>> https://dist.apache.org/repos/dist/dev/spark/v4.0.2-rc1-bin/
>>>>
>>>> Signatures used for Spark RCs can be found in this file:
>>>> https://downloads.apache.org/spark/KEYS
>>>>
>>>> The staging repository for this release can be found at:
>>>> https://repository.apache.org/content/repositories/orgapachespark-1514/
>>>>
>>>> The documentation corresponding to this release can be found at:
>>>> https://dist.apache.org/repos/dist/dev/spark/v4.0.2-rc1-docs/
>>>>
>>>> The list of bug fixes going into 4.0.2 can be found at the following URL:
>>>> https://issues.apache.org/jira/projects/SPARK/versions/12356246
>>>>
>>>> FAQ
>>>>
>>>> =========================
>>>> How can I help test this release?
>>>> =========================
>>>>
>>>> If you are a Spark user, you can help us test this release by taking
>>>> an existing Spark workload and running it on this release candidate, then
>>>> reporting any regressions.
>>>>
>>>> If you're working in PySpark, you can set up a virtual env and install
>>>> the current RC via "pip install
>>>> https://dist.apache.org/repos/dist/dev/spark/v4.0.2-rc1-bin/pyspark-4.0.2.tar.gz"
>>>> and see if anything important breaks.
>>>> In Java/Scala, you can add the staging repository to your project's
>>>> resolvers and test with the RC (make sure to clean up the artifact cache
>>>> before/after so you don't end up building with an out-of-date RC going forward).
>>>>
>>>> ---------------------------------------------------------------------
>>>> To unsubscribe e-mail: [email protected]
>>>>
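
For the Java/Scala testing path mentioned in the FAQ, here is a minimal sketch of what pointing an sbt build at the staging repository might look like. The resolver URL is the one posted in the vote email; the Scala patch version and the choice of Spark modules are placeholders you would adapt to your own project:

```scala
// build.sbt -- sketch for building a test workload against the 4.0.2 RC.
// The staging repository URL is taken from the vote email; project name,
// Scala patch version, and the Spark modules pulled in are placeholders.
ThisBuild / scalaVersion := "2.13.16" // any 2.13.x your project already uses

// Resolve RC artifacts from the Apache staging repository for this vote.
resolvers += "Apache Spark 4.0.2 RC1 staging" at
  "https://repository.apache.org/content/repositories/orgapachespark-1514/"

// Depend on the RC version of the Spark modules your workload exercises.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "4.0.2" % "provided",
  "org.apache.spark" %% "spark-sql"  % "4.0.2" % "provided"
)
```

As the email notes, clear the resolved RC artifacts from your local cache (e.g. ~/.ivy2, ~/.m2, or the coursier cache) before and after testing so later builds don't silently keep using the stale RC.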
