Thank you, Yangjie.

Yes, instead of switching, it will depend entirely on the compatibility
promised by the Scala community.
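
For example, Scala 3 can already consume Scala 2.13 artifacts through sbt's
CrossVersion.for3Use2_13. A minimal sketch of that interop (the dependency
name is only an illustrative placeholder, not a real library):

  // build.sbt -- sketch of the Scala 2.13 <-> 3 interop promise
  scalaVersion := "3.3.0"

  // Resolve the _2.13 artifact even though we compile with Scala 3.
  libraryDependencies += ("com.example" %% "a-2.13-only-lib" % "1.0.0")
    .cross(CrossVersion.for3Use2_13)

The reverse direction (a Scala 2.13 build consuming Scala 3 artifacts) is
covered by CrossVersion.for2_13Use3, so the interop cuts both ways.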

However, many other difficulties still arise from third-party libraries
like Ammonite. Scala 3 might be better from a long-term perspective.

I'll send another email about the Scala community issues.

Dongjoon.



On Fri, Jun 2, 2023 at 2:48 AM yangjie01 <yangji...@baidu.com> wrote:

> +1, I agree to start preparing Apache Spark 4.0 after creating branch-3.5
> on July 16th.
>
>
>
> As I am not yet familiar with Scala 3, I am unable to make good
> suggestions for choosing the Scala version.
>
>
>
> But I want to know: if Spark 4.0 chooses Scala 2.13.x, will it be
> impossible to switch to Scala 3.x as the default version during the
> lifecycle of Spark 4.x?
>
>
>
> Thanks
>
> Yang Jie
>
>
>
> *From:* Dongjoon Hyun <dongj...@apache.org>
> *Date:* Thursday, June 1, 2023, 09:03
> *To:* dev <dev@spark.apache.org>
> *Subject:* Apache Spark 4.0 Timeframe?
>
>
>
> Hi, All.
>
> I'd like to propose that we start preparing Apache Spark 4.0 after
> creating branch-3.5 on July 16th.
>
> - https://spark.apache.org/versioning-policy.html
>
> Historically, Apache Spark releases have followed the timeframes below,
> and we already have a Spark 3.5 plan which will be maintained up to 2026.
>
> Spark 1: 2014.05 (1.0.0) ~ 2016.11 (1.6.3)
> Spark 2: 2016.07 (2.0.0) ~ 2021.05 (2.4.8)
> Spark 3: 2020.06 (3.0.0) ~ 2026.xx (3.5.x)
> Spark 4: 2024.06 (4.0.0, NEW)
>
> As we discussed in the previous email thread, `Apache Spark 3.5.0
> Expectations`, we cannot deliver some features without Apache Spark 4.
>
> - "I wonder if it’s safer to do it in Spark 4 (which I believe will be
> discussed soon)."
> - "I would make it the default at 4.0, myself."
>
> Although there are other features as well, let's focus on the Scala
> language support history.
>
> Spark 2.0: SPARK-6363 Make Scala 2.11 the default Scala version (2016.07)
> Spark 3.0: SPARK-25956 Make Scala 2.12 as default Scala version in Spark
> 3.0 (2020.06)
>
> In addition, the Scala community released Scala 3.3.0 LTS yesterday.
>
> - https://scala-lang.org/blog/2023/05/30/scala-3.3.0-released.html
>
> If we decide to start, I believe we can support Scala 2.13 or Scala 3.3
> next year with Apache Spark 4 while supporting Spark 3.4 and 3.5 for Scala
> 2.12 users.
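>
> For example, with sbt cross-building, a single codebase can target all
> three lines; a minimal sketch (the patch versions are illustrative
> assumptions, not a proposal):
>
>   // build.sbt -- cross-compile for Scala 2.12, 2.13, and 3.3
>   crossScalaVersions := Seq("2.12.17", "2.13.8", "3.3.0")
>
> Running `sbt +compile` then builds against every Scala version listed.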
>
> WDYT?
>
> Thanks,
> Dongjoon.
>
