Hi all!

I'd like to ask for opinions and discuss the following:
at the moment Spark can generally be built with Scala 2.11 and 2.12
(mostly), and is close to supporting Scala 2.13. Meanwhile, Scala 3 is
entering its pre-release phase (with 3.0.0-M1 released at the beginning
of October).

Previously, Spark's support for the current Scala version has lagged
behind the desired state, dictated by circumstances. To do things
differently with Scala 3, I'd like to contribute my efforts (and help
others, if there are any) to supporting it as early as possible (i.e. to
have the Spark build compile with Scala 3 and to publish release
artifacts when that becomes possible).

I suggest adding an experimental profile to the build file, so that
further changes to compile, test, and other tasks can be made
incrementally (preserving compatibility with the current code for
versions 2.12 and 2.13, and backporting where possible). I'd like to do
it this way because I do not represent any company and contribute in my
own time, so I cannot guarantee a consistent amount of time spent on
this (and, just in case, such contributions would not be left stranded
in a fork repo).

In fact, with the recent changes moving the Spark build to the latest
SBT, the initial changes on the SBT side are pretty small (about 10 LOC),
and I was already able to see how the build fails with the Scala 3
compiler :)
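To give a rough idea of what I mean by "pretty small": the sketch below shows one possible shape for such an experimental toggle in an sbt build definition. The version numbers and the `scala3` property name are illustrative assumptions on my part, not the actual keys or versions used in Spark's build:

```scala
// Hypothetical sketch of an experimental Scala 3 profile in build.sbt.
// `crossScalaVersions` and `scalaVersion` are standard sbt keys; the
// "scala3" system property is an assumed opt-in switch, e.g.:
//   sbt -Dscala3=true compile

// Add Scala 3 as an extra (experimental) cross-build target.
ThisBuild / crossScalaVersions := Seq("2.12.12", "2.13.3", "3.0.0-M1")

// Select Scala 3 only when explicitly requested, so the default build
// for 2.12/2.13 stays untouched.
ThisBuild / scalaVersion := {
  if (sys.props.get("scala3").contains("true")) "3.0.0-M1"
  else "2.12.12"
}
```

The point of gating it behind an opt-in flag is exactly the incremental approach above: the experimental profile can fail to compile without affecting the regular 2.12/2.13 builds.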

To summarize:
1. Is this approach suitable for the project at this moment, so that it
would be accepted and accounted for in the release schedule (in 2021, I
assume)?
2. How should it be filed: as an umbrella Jira ticket with minor tasks,
or first as an SPIP with a more thorough analysis?
