Martijn,

I think we mean the same thing but are using different words. Yes, there is a
binary incompatible change in Scala 2.12. This is also a significant road
bump in deciding whether to upgrade Flink's Scala version. But there are
other issues identified by multiple people in that Jira ticket. I am copying
them from the Jira comments:

#2 - Flink makes extensive use of implicit type conversions.
#3 - `TraversableOnce` is gone in 2.13.
#4 - Collection conversion has changed.
#5 - Code generation has changed.
#6 - `Seq` in 2.12 and 2.13 mean different things.

Aris did try to address all of them here:
https://github.com/ariskk/flink/pull/1.
It would require changing 475 files.
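To make item #6 concrete, here is a minimal sketch in plain Scala (no Flink dependency) of how the meaning of the default `Seq` changed between 2.12 and 2.13:

```scala
import scala.collection.mutable.ArrayBuffer

// In Scala 2.12, `scala.Seq` aliases `scala.collection.Seq`, which also
// admits mutable implementations; in 2.13 it aliases
// `scala.collection.immutable.Seq`. The same `Seq[T]` signature therefore
// accepts different argument types depending on the Scala version.
val buffer = ArrayBuffer(1, 2, 3)

// Compiles on 2.12 (ArrayBuffer is a collection.Seq), but NOT on 2.13:
// val xs: Seq[Int] = buffer

// On 2.13 an explicit conversion to an immutable Seq is required:
val xs: Seq[Int] = buffer.toSeq
```

Any API that exposes `Seq` in a public signature is affected by this change, which is part of why the migration touches so many files.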

Sorry, I did not want to pull all the history of Scala support in Flink into
this email thread, but I think I have to.

> Any user can still decide to use a newer version of Scala, by compiling
> Flink with a newer Scala version.
They can compile it only if:
- they use the Scala API from Findify (or flink4s), OR
- they use the Java API plus their own Scala code to provide TypeInformation
instances for Scala case classes, Products, and Scala collections.
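To illustrate the second bullet, a hedged sketch (the `Click` case class is a hypothetical example; `TypeInformation.of` is the Flink Java API's generic entry point). Without the Scala API's `createTypeInformation` macro, users must supply type information themselves:

```scala
import org.apache.flink.api.common.typeinfo.TypeInformation

// Hypothetical user-defined event type.
case class Click(userId: String, count: Long)

// Via the Java API alone, TypeInformation for a Scala case class can only
// be obtained reflectively, which falls back to generic (Kryo-based)
// serialization instead of a dedicated case-class serializer:
val clickInfo: TypeInformation[Click] = TypeInformation.of(classOf[Click])
```

Libraries such as flink-scala-api and Flink4s instead derive these instances at compile time, avoiding the generic-serialization fallback.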

> they would have contributed more on the Scala wrapper
They did contribute: Aris created Flink4s, and Findify created
flink-scala-api. Look at how Flink's TypeInformation is derived/generated in
these projects; it is state of the art.

> The code example's readability ultimately becomes a matter of personal
> preference
I agree. Just one final point: I see that one of the reasons Flink users
choose SQL over Java is that Flink SQL is more expressive. The SQL API is not
as powerful as the Java API, so Scala could offer the best of both.

I appreciate Chesnay's idea to fork the Findify project first under
flink-extended; we can try it, although it only partially solves the
original problem.

Best regards,
Alexey

On Mon, Apr 17, 2023 at 11:38 AM Martijn Visser <martijnvis...@apache.org>
wrote:

> Hi Alexey,
>
> I would argue that it's not a problem with Flink's source code; the
> problem was that Scala introduced a binary incompatible change in Scala
> 2.12.8. If Flink wanted to allow an upgrade, it would mean breaking
> snapshot compatibility. That's why Flink is still bound to be used with
> Scala 2.12.7. Any user can still decide to use a newer version of Scala, by
> compiling Flink with a newer Scala version.
>
> Given that Akka and Spark are predominantly built in Scala, I don't think
> they are comparable with Flink, which is a Java-first application. I still
> would have expected that if the Scala type system and object serialization
> in Flink were a problem for the users, they would have contributed more on
> the Scala wrapper.
>
> The code example's readability ultimately becomes a matter of personal
> preference imho. I don't think that this is an argument we should use in
> the discussion.
>
> I would +1 Chesnay's idea to fork the Findify project first under
> flink-extended and have volunteers step up there. It makes it possible to
> mature the wrappers and see how it develops and gets used in the future.
>
> Best regards,
>
> Martijn
>
> On Mon, Apr 17, 2023 at 10:19 AM Alexey Novakov <ale...@ververica.com>
> wrote:
>
>> Hi Martijn,
>>
>> Thanks for your reply and attention.
>>
>> 1. As I read Nick's report here
>> https://issues.apache.org/jira/browse/FLINK-13414?focusedCommentId=17257763&page=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel#comment-17257763
>> Scala maintainers were blocked by the inability of Flink's source code to
>> migrate from Scala 2.11 to newer versions easily. One strong reason is the
>> extensive use of Scala macros in the Flink Scala API; eventually, a few
>> other Scala users developed third-party Flink wrappers on top of the Java
>> API once that became possible.
>>
>> 2. A Scala wrapper is still needed due to the Scala type system and object
>> serialization in Flink. You cannot easily serialize a Scala product type by
>> ONLY using the Java API. Scala collection types also differ from standard
>> Java collections. If that were not needed, I of course would not have
>> started this discussion and would continue to use the Java API from Scala.
>> You can find the same separation of Scala and Java classes in the Akka and
>> Apache Spark code bases.
>>
>> 3. Another point I did not mention in the first email: the Scala code
>> examples look much more readable in the Flink docs thanks to the concise
>> language syntax. It would be very helpful to keep them in Flink and make
>> sure they work with Scala 2.13 and Scala 3. We would need to make sure that
>> if a user takes a Scala code example from the Flink docs, it works with the
>> latest Scala version without any issue. Otherwise, Scala users will have
>> issues unless they use an extra Scala wrapper for the Java API. If that
>> Scala wrapper is not an official part of the Flink project, then it will be
>> unsafe to use Scala at all. Günter mentioned this in his reply as well.
>>
>> Best regards,
>> Alexey
>>
>> On Mon, Apr 17, 2023 at 9:27 AM Martijn Visser <martijnvis...@apache.org>
>> wrote:
>>
>>> Hi Alexey,
>>>
>>> > Taking into account my Scala experience for the last 8 years, I
>>> predict these wrappers will eventually be abandoned, unless such a Scala
>>> library is a part of some bigger community like ASF.
>>>
>>> For the past couple of years, there have been no maintainers for Scala
>>> in the Flink community. It was one of the reasons to deprecate the Scala
>>> APIs. Given that the wrappers don't seem to have taken off outside of
>>> Flink, why would moving them under the ASF resolve this?
>>>
>>> > Also, non-official Scala API will lead people to play safe and choose
>>> Java API only, even if they did want that at the beginning.
>>>
>>> Why would that be a problem? Wouldn't the fact that there are no
>>> maintainers for the Scala wrappers indicate that Scala users are actually
>>> fine with using the Java APIs, since otherwise there would have been
>>> improvements made to the Scala wrappers?
>>>
>>> Best regards,
>>>
>>> Martijn
>>>
>>> On Sun, Apr 16, 2023 at 11:47 AM David Morávek <d...@apache.org> wrote:
>>>
>>>> cc dev@f.a.o
>>>>
>>>> On Sun, Apr 16, 2023 at 11:42 AM David Morávek <d...@apache.org> wrote:
>>>>
>>>> > Hi Alexey,
>>>> >
>>>> > I'm a bit skeptical because, looking at the project, I see a couple of
>>>> > red flags:
>>>> >
>>>> > - The project is inactive. The last release and commit are both from
>>>> > last May.
>>>> > - The project has not been adapted for the last two Flink versions,
>>>> > which signals a lack of users.
>>>> > - All commits are by a single person, which could mean that there is
>>>> > no community around the project.
>>>> > - There was no external contribution (except the Scala bot).
>>>> > - There is no fork of the project (except the Scala bot).
>>>> >
>>>> > > As I know, Findify does not want or cannot maintain this library.
>>>> >
>>>> > Who are the users of the library? I'd assume Findify no longer uses it
>>>> > if they're abandoning it.
>>>> >
>>>> > > which would be similar to the StateFun
>>>> >
>>>> > We're currently dealing with a lack of maintainers for StateFun, so we
>>>> > should have a solid building ground around the project to avoid the
>>>> > same issue.
>>>> >
>>>> >
>>>> > I think there is value in having a modern Scala API, but we should
>>>> > have a bigger plan to address the future of Flink Scala APIs than
>>>> > importing an unmaintained library and calling it a day. I suggest
>>>> > starting a thread on the dev ML and concluding the overall plan first.
>>>> >
>>>> > Best,
>>>> > D.
>>>> >
>>>> > On Sun, Apr 16, 2023 at 10:48 AM guenterh.lists <
>>>> guenterh.li...@bluewin.ch>
>>>> > wrote:
>>>> >
>>>> >> Hello Alexey
>>>> >>
>>>> >> Thank you for your initiative and your suggestion!
>>>> >>
>>>> >> I can only fully support the following statements in your email:
>>>> >>
>>>> >> > Taking into account my Scala experience for the last 8 years, I
>>>> >> > predict these wrappers will eventually be abandoned, unless such a
>>>> >> > Scala library is a part of some bigger community like ASF.
>>>> >> > Also, non-official Scala API will lead people to play safe and
>>>> >> > choose Java API only, even if they didn't want that at the beginning.
>>>> >>
>>>> >> The second sentence describes my current state.
>>>> >>
>>>> >> From my point of view it would be very unfortunate if the Flink
>>>> >> project were to lose the Scala API and thus the integration of the
>>>> >> concise, flexible and future-oriented language constructs of Scala
>>>> >> (and the further development of version 3).
>>>> >>
>>>> >> Documentation of the API is essential. I would be interested in
>>>> >> supporting these efforts.
>>>> >>
>>>> >> Best wishes
>>>> >>
>>>> >> Günter
>>>> >>
>>>> >>
>>>> >> On 13.04.23 15:39, Alexey Novakov via user wrote:
>>>> >> > Hello Flink PMCs and Flink Scala Users,
>>>> >> >
>>>> >> > I would like to propose an idea to take the 3rd party Scala API
>>>> >> > findify/flink-scala-api <https://github.com/findify/flink-scala-api>
>>>> >> > project into the Apache Flink organization.
>>>> >> >
>>>> >> > *Motivation *
>>>> >> >
>>>> >> > The Scala-free Flink idea was finally implemented in the 1.15
>>>> >> > release and allowed Flink users to bring their own Scala version and
>>>> >> > use it via the Flink Java API. See the blog post: Scala Free in One
>>>> >> > Fifteen
>>>> >> > <https://flink.apache.org/2022/02/22/scala-free-in-one-fifteen/>.
>>>> >> > Also, the existing Flink Scala API will be deprecated, because it is
>>>> >> > too hard to upgrade it to Scala 2.13 or 3.
>>>> >> >
>>>> >> > Taking into account my Scala experience for the last 8 years, I
>>>> >> > predict these wrappers will eventually be abandoned, unless such a
>>>> >> > Scala library is a part of some bigger community like ASF.
>>>> >> > Also, a non-official Scala API will lead people to play safe and
>>>> >> > choose the Java API only, even if they didn't want that at the
>>>> >> > beginning.
>>>> >> >
>>>> >> > https://github.com/findify/flink-scala-api has already advanced and
>>>> >> > implemented Scala support for the 2.13 and 3 versions on top of the
>>>> >> > Flink Java API. As far as I know, Findify does not want to, or does
>>>> >> > not have the capacity to, maintain this library. I propose to fork
>>>> >> > this great library and create a new Flink project with its own
>>>> >> > version and build process (SBT, not Maven), which would be similar
>>>> >> > to the StateFun or FlinkML projects.
>>>> >> >
>>>> >> > *Proposal *
>>>> >> >
>>>> >> > 1. Create a fork of findify/flink-scala-api and host it in the
>>>> >> > Apache Flink Git space (PMCs, please advise).
>>>> >> > 2. Roman
>>>> >> > <https://issues.apache.org/jira/secure/ViewProfile.jspa?name=rgrebennikov>
>>>> >> > and I would be willing to maintain this library for the next several
>>>> >> > years. Further, we believe it will live on its own.
>>>> >> > 3. Flink Docs: PMCs, we need your guidelines here. One way I see is
>>>> >> > to create new documentation in a similar way to the StateFun docs.
>>>> >> > Alternatively, we could just fix the existing Flink Scala code
>>>> >> > examples to make sure they work with the new wrapper. In any case, I
>>>> >> > expect the docs will be upgraded/fixed gradually.
>>>> >> >
>>>> >> > I hope you will find this idea interesting and worth going forward.
>>>> >> >
>>>> >> > P.S. The irony here is that findify/flink-scala-api was also a fork
>>>> >> > of the Flink Scala API some time ago, so we have a chance to close
>>>> >> > the loop :-)
>>>> >> >
>>>> >> > Best regards.
>>>> >> > Alexey
>>>> >> >
>>>> >> --
>>>> >> Günter Hipler
>>>> >> https://openbiblio.social/@vog61
>>>> >> https://twitter.com/vog61
>>>> >>
>>>> >>
>>>>
>>>
