+1

Sent from my iPhone
> On Jan 4, 2026, at 11:41 PM, Yang Jie <[email protected]> wrote:
>
> +1
>
>> On 2026/01/05 06:49:18 Xiao Li wrote:
>> +1
>>
>>> On Sun, Jan 4, 2026 at 22:45 Wenchen Fan <[email protected]> wrote:
>>>
>>> +1
>>>
>>>> On Sun, Jan 4, 2026 at 10:47 AM Cheng Pan <[email protected]> wrote:
>>>>
>>>> +1 (non-binding)
>>>>
>>>> Thanks,
>>>> Cheng Pan
>>>>
>>>> On Jan 3, 2026, at 06:04, Hyukjin Kwon <[email protected]> wrote:
>>>>
>>>> Starting with my own +1
>>>>
>>>> On Fri, Jan 2, 2026 at 11:45 PM <[email protected]> wrote:
>>>>
>>>>> Please vote on releasing the following candidate as Apache Spark
>>>>> version 4.1.1.
>>>>>
>>>>> The vote is open until Mon, 05 Jan 2026 07:44:41 PST and passes if a
>>>>> majority of +1 PMC votes are cast, with a minimum of 3 +1 votes.
>>>>>
>>>>> [ ] +1 Release this package as Apache Spark 4.1.1
>>>>> [ ] -1 Do not release this package because ...
>>>>>
>>>>> To learn more about Apache Spark, please see https://spark.apache.org/
>>>>>
>>>>> The tag to be voted on is v4.1.1-rc2 (commit c0690c763ba):
>>>>> https://github.com/apache/spark/tree/v4.1.1-rc2
>>>>>
>>>>> The release files, including signatures, digests, etc. can be found at:
>>>>> https://dist.apache.org/repos/dist/dev/spark/v4.1.1-rc2-bin/
>>>>>
>>>>> Signatures used for Spark RCs can be found in this file:
>>>>> https://downloads.apache.org/spark/KEYS
>>>>>
>>>>> The staging repository for this release can be found at:
>>>>> https://repository.apache.org/content/repositories/orgapachespark-1510/
>>>>>
>>>>> The documentation corresponding to this release can be found at:
>>>>> https://dist.apache.org/repos/dist/dev/spark/v4.1.1-rc2-docs/
>>>>>
>>>>> The list of bug fixes going into 4.1.1 can be found at the following URL:
>>>>> https://issues.apache.org/jira/projects/SPARK/versions/12356469
>>>>>
>>>>> FAQ
>>>>>
>>>>> =========================
>>>>> How can I help test this release?
>>>>> =========================
>>>>>
>>>>> If you are a Spark user, you can help us test this release by taking
>>>>> an existing Spark workload and running it on this release candidate,
>>>>> then reporting any regressions.
>>>>>
>>>>> If you're working in PySpark, you can set up a virtual env and install
>>>>> the current RC via "pip install
>>>>> https://dist.apache.org/repos/dist/dev/spark/v4.1.1-rc2-bin/pyspark-4.1.1.tar.gz"
>>>>> and see if anything important breaks.
>>>>> In Java/Scala, you can add the staging repository to your project's
>>>>> resolvers and test with the RC (make sure to clean up the artifact
>>>>> cache before/after so you don't end up building with an out-of-date RC
>>>>> going forward).
>>>>>
>>>>> ---------------------------------------------------------------------
>>>>> To unsubscribe e-mail: [email protected]
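For the Java/Scala testing step described in the FAQ, a minimal sketch of what adding the staging repository to a project's resolvers could look like in sbt — the resolver name, Scala version, and `spark-sql` dependency are illustrative assumptions; only the staging URL and the 4.1.1 version come from the thread:

```scala
// Hypothetical build.sbt fragment for testing the RC against the staging
// repository from the vote email. Resolver name and dependency choice are
// placeholders; adjust to your own project.
ThisBuild / scalaVersion := "2.13.16"

resolvers += "Apache Spark RC staging" at
  "https://repository.apache.org/content/repositories/orgapachespark-1510/"

libraryDependencies += "org.apache.spark" %% "spark-sql" % "4.1.1"
```

As the email advises, clear the local artifact cache (e.g. `~/.ivy2` or the Coursier cache) before and after testing so later builds don't silently pick up the RC artifacts.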
