Thank you, Holden, Liang-Chi, Huaxin, Jungtaek.

We included SPARK-49829, SPARK-50021, and SPARK-50022, and reverted SPARK-49909
and SPARK-50011 to stabilize `branch-3.4`. As of now, all branch-3.4 CIs are
healthy.
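
For anyone following the branch mechanics, backports and reverts of this kind
are ordinary git operations; a minimal sketch with placeholder commit hashes
(not the actual SPARK-49829 / SPARK-49909 commits), assuming the canonical
remote is named `apache`:

$ git checkout branch-3.4
$ git cherry-pick -x abc1234   # backport a fix, recording the original commit hash
$ git revert def5678           # back out a patch that destabilized the branch
$ git push apache branch-3.4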

Commit Test: https://github.com/apache/spark/tree/branch-3.4
Scala 2.13 Test: 
https://github.com/apache/spark/actions/workflows/build_branch34.yml
Python Test: 
https://github.com/apache/spark/actions/workflows/build_branch34_python.yml

Since there are no other opinions, I'll prepare and start the Apache Spark 3.4.4
RC1 next Monday (2024-10-21). Thank you again for your help.
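
As a rough sketch of the RC cut itself, assuming the usual vX.Y.Z-rcN tag
convention (the full process goes through the release scripts under
dev/create-release), with a placeholder commit hash:

$ git checkout branch-3.4
$ git tag v3.4.4-rc1 1234abcd   # tag the release candidate commit
$ git push apache v3.4.4-rc1    # assuming 'apache' is the canonical remote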

Dongjoon.

On 2024/10/17 02:44:35 Jungtaek Lim wrote:
> There is another open correctness issue for the 3.4 version line - a PR is up
> and has been approved by a non-committer, but I'm struggling to find a
> committer to review and approve it.
> 
> Issue: https://issues.apache.org/jira/browse/SPARK-49829
> PR: https://github.com/apache/spark/pull/48297
> 
> I'd propose including this PR in Spark 3.4.4 before declaring the 3.4
> version line EOL.
> 
> @Liang-Chi Hsieh <vii...@gmail.com> I hope you would be willing to review
> the PR. Thanks in advance.
> 
> 2024년 10월 17일 (목) 오전 6:16, huaxin gao <huaxin.ga...@gmail.com>님이 작성:
> 
> > +1
> >
> > On Wed, Oct 16, 2024 at 1:53 PM L. C. Hsieh <vii...@gmail.com> wrote:
> >
> >> +1
> >>
> >> Thanks Dongjoon.
> >>
> >>
> >> On Wed, Oct 16, 2024 at 11:41 AM Holden Karau <holden.ka...@gmail.com>
> >> wrote:
> >> >
> >> > +1 on a 3.4.4 EOL release
> >> >
> >> > On Wed, Oct 16, 2024 at 9:37 AM Dongjoon Hyun <dongjoon.h...@gmail.com>
> >> wrote:
> >> >>
> >> >> Hi, All.
> >> >>
> >> >> Since the Apache Spark 3.4.0 RC7 vote passed on Apr 6, 2023,
> >> branch-3.4 has been maintained and has served us well until now.
> >> >>
> >> >> - https://github.com/apache/spark/releases/tag/v3.4.0 (tagged on Apr
> >> 6, 2023)
> >> >> - https://lists.apache.org/thread/0o61jn9cmg6r0f22ljgjg5c31z8fn0zn
> >> (vote result on April 13th, 2023)
> >> >>
> >> >> As of today, branch-3.4 has 100 additional patches after v3.4.3
> >> (tagged on April 14th, about 6 months ago) and reaches end of life this
> >> month according to the Apache Spark release cadence,
> >> https://spark.apache.org/versioning-policy.html .
> >> >>
> >> >> $ git log --oneline v3.4.3..HEAD | wc -l
> >> >>      100
> >> >>
> >> >> Moreover, there are seven unreleased correctness patches.
> >> >>
> >> >> SPARK-47927 Nullability after join not respected in UDF
> >> >> SPARK-48019 ColumnVectors with dictionaries and nulls are not
> >> read/copied correctly
> >> >> SPARK-48037 SortShuffleWriter lacks shuffle write related metrics
> >> resulting in potentially inaccurate data
> >> >> SPARK-48105 Fix the data corruption issue when state store unload and
> >> snapshotting happens concurrently for HDFS state store
> >> >> SPARK-48965 toJSON produces wrong values if DecimalType information is
> >> lost in as[Product]
> >> >> SPARK-49000 Aggregation with DISTINCT gives wrong results when dealing
> >> with literals
> >> >> SPARK-49836 The outer query is broken when the subquery uses window
> >> function which receives time window as parameter
> >> >>
> >> >> Along with the recent Apache Spark 4.0.0-preview2 and 3.5.3 releases,
> >> I hope users can get a chance to pick up these last bits of Apache Spark
> >> 3.4.x, so I'd like to propose holding the Apache Spark 3.4.4 EOL release vote
> >> on October 21st and to volunteer as the release manager.
> >> >>
> >> >> WDYT?
> >> >>
> >> >> Please let us know if you need more patches on branch-3.4.
> >> >>
> >> >> Thanks,
> >> >> Dongjoon.
> >> >
> >> >
> >> >
> >> > --
> >> > Twitter: https://twitter.com/holdenkarau
> >> > Fight Health Insurance: https://www.fighthealthinsurance.com/
> >> > Books (Learning Spark, High Performance Spark, etc.):
> >> https://amzn.to/2MaRAG9
> >> > YouTube Live Streams: https://www.youtube.com/user/holdenkarau
> >> > Pronouns: she/her
> >>
> >> ---------------------------------------------------------------------
> >> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
> >>
> >>
> 

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
