+1

On Mon, Jan 12, 2026 at 16:22 Kent Yao <[email protected]> wrote:
> +1
>
> Kent
>
> On Mon, Jan 12, 2026 at 15:13 Dongjoon Hyun <[email protected]> wrote:
>
>> +1
>>
>> Dongjoon.
>>
>> On 2026/01/12 07:09:52 Hyukjin Kwon wrote:
>> > +1!
>> >
>> > On Mon, 12 Jan 2026 at 16:03, <[email protected]> wrote:
>> >
>> > > Please vote on releasing the following candidate as Apache Spark
>> > > version 3.5.8.
>> > >
>> > > The vote is open until Thu, 15 Jan 2026 00:02:28 PST and passes if a
>> > > majority of +1 PMC votes are cast, with
>> > > a minimum of 3 +1 votes.
>> > >
>> > > [ ] +1 Release this package as Apache Spark 3.5.8
>> > > [ ] -1 Do not release this package because ...
>> > >
>> > > To learn more about Apache Spark, please see https://spark.apache.org/
>> > >
>> > > The tag to be voted on is v3.5.8-rc1 (commit 5a48a37b2db):
>> > > https://github.com/apache/spark/tree/v3.5.8-rc1
>> > >
>> > > The release files, including signatures, digests, etc. can be found at:
>> > > https://dist.apache.org/repos/dist/dev/spark/v3.5.8-rc1-bin/
>> > >
>> > > Signatures used for Spark RCs can be found in this file:
>> > > https://downloads.apache.org/spark/KEYS
>> > >
>> > > The staging repository for this release can be found at:
>> > > https://repository.apache.org/content/repositories/orgapachespark-1513/
>> > >
>> > > The documentation corresponding to this release can be found at:
>> > > https://dist.apache.org/repos/dist/dev/spark/v3.5.8-rc1-docs/
>> > >
>> > > The list of bug fixes going into 3.5.8 can be found at the following URL:
>> > > https://issues.apache.org/jira/projects/SPARK/versions/12356288
>> > >
>> > > FAQ
>> > >
>> > > =========================
>> > > How can I help test this release?
>> > > =========================
>> > >
>> > > If you are a Spark user, you can help us test this release by taking
>> > > an existing Spark workload and running it on this release candidate,
>> > > then reporting any regressions.
>> > >
>> > > If you're working in PySpark, you can set up a virtual env and install
>> > > the current RC via "pip install
>> > > https://dist.apache.org/repos/dist/dev/spark/v3.5.8-rc1-bin/pyspark-3.5.8.tar.gz"
>> > > and see if anything important breaks.
>> > > In Java/Scala, you can add the staging repository to your project's
>> > > resolvers and test with the RC (make sure to clean up the artifact
>> > > cache before/after so you don't end up building with an out-of-date
>> > > RC going forward).
>> > >
>> > > ---------------------------------------------------------------------
>> > > To unsubscribe e-mail: [email protected]
>> > >
>> > >
>> >
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: [email protected]
>>
>>

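For the Java/Scala testing step described in the quoted FAQ, a minimal sbt sketch of what adding the staging repository to a project's resolvers might look like (assuming an sbt build; the spark-sql module and Scala version below are illustrative choices, not part of the RC announcement):

  // build.sbt -- build an existing workload against the 3.5.8 RC artifacts
  scalaVersion := "2.12.18"

  // Resolve RC artifacts from the staging repository given in the vote e-mail
  resolvers += "Apache Spark 3.5.8 RC1 staging" at
    "https://repository.apache.org/content/repositories/orgapachespark-1513/"

  // Any Spark module your workload uses; spark-sql is just an example
  libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.5.8"

As the vote e-mail notes, clean up the cached org.apache.spark artifacts in your local ivy/coursier cache before and after testing so later builds don't pick up the RC.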