Hi, All.

I'd like to propose that we start preparing Apache Spark 4.0 after creating
branch-3.5 on July 16th.

- https://spark.apache.org/versioning-policy.html

Historically, the Apache Spark release lines have had the following
timeframes, and we already have a Spark 3.5 plan, which will be maintained
up to 2026.

Spark 1: 2014.05 (1.0.0) ~ 2016.11 (1.6.3)
Spark 2: 2016.07 (2.0.0) ~ 2021.05 (2.4.8)
Spark 3: 2020.06 (3.0.0) ~ 2026.xx (3.5.x)
Spark 4: 2024.06 (4.0.0, NEW)

As we discussed in the previous email thread, `Apache Spark 3.5.0
Expectations`, there are some features we cannot deliver without Apache
Spark 4.

- "I wonder if it’s safer to do it in Spark 4 (which I believe will be
discussed soon)."
- "I would make it the default at 4.0, myself."

Although there are other such features as well, let's focus on the history
of Scala language support.

Spark 2.0: SPARK-6363 Make Scala 2.11 the default Scala version (2016.07)
Spark 3.0: SPARK-25956 Make Scala 2.12 as default Scala version in Spark
3.0 (2020.06)

In addition, the Scala community released Scala 3.3.0 LTS yesterday.

- https://scala-lang.org/blog/2023/05/30/scala-3.3.0-released.html

If we decide to start, I believe we can support Scala 2.13 or Scala 3.3
next year with Apache Spark 4, while continuing to support Scala 2.12
users via Spark 3.4 and 3.5. (See the sketch below.)
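
To illustrate the kind of source-level change a default Scala version bump
involves (this sketch is mine, not from the earlier thread): in Scala 2.12
the top-level `Seq` alias refers to scala.collection.Seq, while in Scala
2.13 it refers to scala.collection.immutable.Seq, so the same line of code
can compile on one version and not the other.

    // Illustrative sketch only (not from this thread): the default
    // `Seq` alias changed between Scala 2.12 and Scala 2.13.
    object SeqAliasExample {
      def main(args: Array[String]): Unit = {
        val buf = scala.collection.mutable.ArrayBuffer(1, 2, 3)

        // Compiles on 2.12, where Seq = scala.collection.Seq and
        // ArrayBuffer conforms to it, but fails to compile on 2.13,
        // where Seq = scala.collection.immutable.Seq:
        // val s: Seq[Int] = buf

        // Portable across both versions: convert explicitly.
        // (On 2.13 this produces an immutable copy.)
        val s: Seq[Int] = buf.toSeq
        println(s)
      }
    }

Source-breaking differences like this are one reason the default Scala
version has historically moved only at major Spark releases.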

WDYT?

Thanks,
Dongjoon.
