This vote failed. I'll cut RC2 later this week. Thanks to everyone
for participating!

On Sat, Feb 22, 2025 at 12:53 AM Max Gekk <max.g...@gmail.com> wrote:

> -1, need a proper error message for not fully implemented feature:
> https://issues.apache.org/jira/browse/SPARK-51289
>
> On Fri, Feb 21, 2025 at 4:11 PM Adam Binford <adam...@gmail.com> wrote:
>
>> Is there also supposed to be a pyspark-client package released? Don't see
>> that in the dist.
>>
>> On Fri, Feb 21, 2025 at 2:23 AM Hyukjin Kwon <gurwls...@apache.org>
>> wrote:
>>
>>> Yeah, I would like people to test this out. Let's wait a bit more.
>>> On Fri, Feb 21, 2025 at 9:02 AM Mich Talebzadeh <
>>> mich.talebza...@gmail.com> wrote:
>>>
>>>>
>>>>    - The purpose of an RC is to provide a near-final version for
>>>>    testing and validation, not a fully-fledged release.
>>>>    - Personally, I tend to wait for the final release, as RCs are
>>>>    primarily valuable for early testers and those who want to help
>>>>    identify last-minute issues.
>>>>    - RC1 is typically followed by a sequence of additional RCs (e.g.,
>>>>    RC2, RC3) as needed, until all blockers are resolved and the final
>>>>    release is ready.
>>>>
>>>> HTH
>>>>
>>>> Dr Mich Talebzadeh,
>>>> Architect | Data Science | Financial Crime | Forensic Analysis | GDPR
>>>>
>>>>    view my Linkedin profile
>>>> <https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>
>>>>
>>>> On Thu, 20 Feb 2025 at 22:20, Jungtaek Lim <
>>>> kabhwan.opensou...@gmail.com> wrote:
>>>>
>>>>> -1 (non-binding), as we have a couple of live blockers.
>>>>>
>>>>> I know RC1 is mostly for everyone to audit the release artifacts
>>>>> earlier before everything is ready, but just wanted to make clear that we
>>>>> expect to have another RC.
>>>>>
>>>>> On Wed, Feb 19, 2025 at 7:26 PM Mich Talebzadeh <
>>>>> mich.talebza...@gmail.com> wrote:
>>>>>
>>>>>> +1
>>>>>>
>>>>>> Dr Mich Talebzadeh,
>>>>>> Architect | Data Science | Financial Crime | Forensic Analysis | GDPR
>>>>>>
>>>>>>    view my Linkedin profile
>>>>>> <https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>
>>>>>>
>>>>>> On Wed, 19 Feb 2025 at 09:31, Wenchen Fan <cloud0...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>> Please vote on releasing the following candidate as Apache Spark
>>>>>>> version 4.0.0.
>>>>>>>
>>>>>>> The vote is open until February 21 (PST) and passes if a majority +1
>>>>>>> PMC votes are cast, with a minimum of 3 +1 votes.
>>>>>>>
>>>>>>> [ ] +1 Release this package as Apache Spark 4.0.0
>>>>>>> [ ] -1 Do not release this package because ...
>>>>>>>
>>>>>>> To learn more about Apache Spark, please see
>>>>>>> https://spark.apache.org/
>>>>>>>
>>>>>>> The tag to be voted on is v4.0.0-rc1 (commit
>>>>>>> 860bd5e93d659852c0e80a82d4494d0e9548fde5)
>>>>>>> https://github.com/apache/spark/tree/v4.0.0-rc1
>>>>>>>
>>>>>>> The release files, including signatures, digests, etc. can be found
>>>>>>> at:
>>>>>>> https://dist.apache.org/repos/dist/dev/spark/v4.0.0-rc1-bin/
>>>>>>>
>>>>>>> Signatures used for Spark RCs can be found in this file:
>>>>>>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>>>>>>
>>>>>>> The staging repository for this release can be found at:
>>>>>>>
>>>>>>> https://repository.apache.org/content/repositories/orgapachespark-1475/
>>>>>>>
>>>>>>> The documentation corresponding to this release can be found at:
>>>>>>> https://dist.apache.org/repos/dist/dev/spark/v4.0.0-rc1-docs/
>>>>>>>
>>>>>>> The list of bug fixes going into 4.0.0 can be found at the following
>>>>>>> URL:
>>>>>>> https://issues.apache.org/jira/projects/SPARK/versions/12353359
>>>>>>>
>>>>>>> This release is using the release script of the tag v4.0.0-rc1.
>>>>>>>
>>>>>>> FAQ
>>>>>>>
>>>>>>> =========================
>>>>>>> How can I help test this release?
>>>>>>> =========================
>>>>>>>
>>>>>>> If you are a Spark user, you can help us test this release by taking
>>>>>>> an existing Spark workload, running it on this release candidate, and
>>>>>>> reporting any regressions.
>>>>>>>
>>>>>>> If you're working in PySpark, you can set up a virtual env, install
>>>>>>> the current RC, and see if anything important breaks. In Java/Scala,
>>>>>>> you can add the staging repository to your project's resolvers and
>>>>>>> test with the RC (make sure to clean up the artifact cache
>>>>>>> before/after so you don't end up building with an out-of-date RC
>>>>>>> going forward).
>>>>>>>
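[Editor's note: a minimal sketch of the PySpark testing path described above. The exact tarball filename under the RC dist directory is an assumption; verify it against the v4.0.0-rc1-bin/ listing before running.]

```shell
# Create and activate an isolated environment for RC testing
python -m venv spark-rc-test
source spark-rc-test/bin/activate

# Install the RC's PySpark package from the dist area.
# (Tarball name is an assumption -- check the v4.0.0-rc1-bin/ listing.)
pip install "https://dist.apache.org/repos/dist/dev/spark/v4.0.0-rc1-bin/pyspark-4.0.0.tar.gz"

# Quick smoke test: start a local session and run a trivial job,
# then try your real workloads against this install.
python -c "from pyspark.sql import SparkSession; \
spark = SparkSession.builder.master('local[2]').getOrCreate(); \
print(spark.range(10).count())"
```

For the Java/Scala path, an sbt build would add the staging repository with something like `resolvers += "ASF staging" at "https://repository.apache.org/content/repositories/orgapachespark-1475/"` and pin the Spark dependencies to 4.0.0; remember to clear the local artifact cache afterwards, as the message above notes.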
>>>>>>
>>
>> --
>> Adam Binford
>>
>