+1 (non-binding)

Thanks for driving this release, and for your patience through multiple RCs!

On Tue, Sep 12, 2023 at 10:00 AM Yuanjian Li <xyliyuanj...@gmail.com> wrote:

> +1 (non-binding)
>
> Yuanjian Li <xyliyuanj...@gmail.com> wrote on Mon, Sep 11, 2023 at 09:36:
>
>> @Peter Toth <peter.t...@gmail.com> I've looked into the details of this
>> issue, and it appears that it's neither a regression in version 3.5.0 nor a
>> correctness issue. It's a bug related to a new feature. I think we can fix
>> this in 3.5.1 and list it as a known issue of the Scala client of Spark
>> Connect in 3.5.0.
>>
>> Mridul Muralidharan <mri...@gmail.com> wrote on Sun, Sep 10, 2023 at 04:12:
>>
>>>
>>> +1
>>>
>>> Signatures, digests, etc check out fine.
>>> Checked out tag and build/tested with -Phive -Pyarn -Pmesos -Pkubernetes
>>>
>>> Regards,
>>> Mridul
>>>
>>> On Sat, Sep 9, 2023 at 10:02 AM Yuanjian Li <xyliyuanj...@gmail.com>
>>> wrote:
>>>
>>>> Please vote on releasing the following candidate (RC5) as Apache Spark
>>>> version 3.5.0.
>>>>
>>>> The vote is open until 11:59 pm Pacific time, Sep 11th, and passes if a
>>>> majority of +1 PMC votes are cast, with a minimum of 3 +1 votes.
>>>>
>>>> [ ] +1 Release this package as Apache Spark 3.5.0
>>>>
>>>> [ ] -1 Do not release this package because ...
>>>>
>>>> To learn more about Apache Spark, please see http://spark.apache.org/
>>>>
>>>> The tag to be voted on is v3.5.0-rc5 (commit
>>>> ce5ddad990373636e94071e7cef2f31021add07b):
>>>>
>>>> https://github.com/apache/spark/tree/v3.5.0-rc5
>>>>
>>>> The release files, including signatures, digests, etc. can be found at:
>>>>
>>>> https://dist.apache.org/repos/dist/dev/spark/v3.5.0-rc5-bin/
>>>>
>>>> Signatures used for Spark RCs can be found in this file:
>>>>
>>>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>>>
>>>> The staging repository for this release can be found at:
>>>>
>>>> https://repository.apache.org/content/repositories/orgapachespark-1449
>>>>
>>>> The documentation corresponding to this release can be found at:
>>>>
>>>> https://dist.apache.org/repos/dist/dev/spark/v3.5.0-rc5-docs/
>>>>
>>>> The list of bug fixes going into 3.5.0 can be found at the following
>>>> URL:
>>>>
>>>> https://issues.apache.org/jira/projects/SPARK/versions/12352848
>>>>
>>>> This release uses the release script from the tag v3.5.0-rc5.
>>>>
>>>>
>>>> FAQ
>>>>
>>>> =========================
>>>>
>>>> How can I help test this release?
>>>>
>>>> =========================
>>>>
>>>> If you are a Spark user, you can help us test this release by taking
>>>> an existing Spark workload, running it on this release candidate, and
>>>> reporting any regressions.
>>>>
>>>> If you're working in PySpark, you can set up a virtual env and install
>>>> the current RC to see if anything important breaks. In Java/Scala, you
>>>> can add the staging repository to your project's resolvers and test
>>>> with the RC (make sure to clean up the artifact cache before and after
>>>> so you don't end up building with an out-of-date RC going forward).
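>>>> The Java/Scala path above can be sketched as a build.sbt fragment. This
>>>> is a minimal example assuming an sbt project: the staging repository URL
>>>> is the one listed earlier in this email, and spark-sql is just one
>>>> illustrative module — swap in whichever Spark modules your project uses.

```scala
// build.sbt fragment (sketch) — resolve Spark 3.5.0 RC artifacts from the
// staging repository given in this vote thread, instead of Maven Central.
resolvers += "Apache Spark RC staging" at
  "https://repository.apache.org/content/repositories/orgapachespark-1449"

// Example dependency; any Spark module your workload needs works the same way.
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.5.0"
```

>>>> After testing, remember to clear cached RC artifacts (e.g. under
>>>> ~/.ivy2 or the coursier cache), per the note above, so later builds
>>>> don't silently keep using the stale RC.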
>>>>
>>>> ===========================================
>>>>
>>>> What should happen to JIRA tickets still targeting 3.5.0?
>>>>
>>>> ===========================================
>>>>
>>>> The current list of open tickets targeted at 3.5.0 can be found at:
>>>>
>>>> https://issues.apache.org/jira/projects/SPARK and search for "Target
>>>> Version/s" = 3.5.0
>>>>
>>>> Committers should look at those and triage. Extremely important bug
>>>> fixes, documentation, and API tweaks that impact compatibility should
>>>> be worked on immediately. Everything else, please retarget to an
>>>> appropriate release.
>>>>
>>>> ==================
>>>>
>>>> But my bug isn't fixed?
>>>>
>>>> ==================
>>>>
>>>> In order to make timely releases, we will typically not hold the
>>>> release unless the bug in question is a regression from the previous
>>>> release. That being said, if there is a regression that has not been
>>>> correctly targeted, please ping me or a committer to help target the
>>>> issue.
>>>>
>>>> Thanks,
>>>>
>>>> Yuanjian Li
>>>>
>>>
