[ https://issues.apache.org/jira/browse/FLINK-23978?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17704020#comment-17704020 ]

Alexey Novakov edited comment on FLINK-23978 at 3/23/23 9:42 AM:
-----------------------------------------------------------------

[~chesnay] Yes, I understand that the latest Flink version does not have 
explicit support for Scala 2.13/3 and that the existing Scala modules are built 
for Scala 2.12.

The small problem here is that a Flink user who wants to build a new job in 
Scala 2.13/3 has to keep the
{noformat}
"org.apache.flink" % "flink-streaming-scala_2.12"
{noformat}
dependency on the *runtime* classpath to be able to run their Flink job, 
because _DefaultScalaProductFieldAccessorFactory_ (a Java class) is also needed 
by projects on newer Scala versions. Otherwise, the following exception is 
thrown:
{code:java}
java.lang.ExceptionInInitializerError
    at Main$.<clinit>(main.scala:21)
    at Main.main(main.scala)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:568)
    at sbt.Run.invokeMain(Run.scala:143)
    at sbt.Run.execute$1(Run.scala:93)
    at sbt.Run.$anonfun$runWithLoader$5(Run.scala:120)
    at sbt.Run$.executeSuccess(Run.scala:186)
    at sbt.Run.runWithLoader(Run.scala:120)
    at sbt.Run.run(Run.scala:127)
    at com.olegych.scastie.sbtscastie.SbtScastiePlugin$$anon$1.$anonfun$run$1(SbtScastiePlugin.scala:38)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at sbt.util.InterfaceUtil$$anon$1.get(InterfaceUtil.scala:21)
    at sbt.ScastieTrapExit$App.run(ScastieTrapExit.scala:258)
    at java.base/java.lang.Thread.run(Thread.java:833)
Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make field private static final int java.lang.Class.ANNOTATION accessible: module java.base does not "opens java.lang" to unnamed module @1fe8cd8
    at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:354)
    at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:297)
    at java.base/java.lang.reflect.Field.checkCanSetAccessible(Field.java:178)
    at java.base/java.lang.reflect.Field.setAccessible(Field.java:172)
    at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:106)
    at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:132)
    at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:132)
    at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:69)
    at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.clean(StreamExecutionEnvironment.java:2194)
    at org.apache.flink.streaming.api.datastream.DataStream.clean(DataStream.java:201){code}
If that dependency is on the classpath, everything works fine. The only 
confusing part is that one needs to have it on the runtime classpath as well, 
for example in an sbt build as sketched below.
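For illustration, a minimal sbt sketch of such a setup (the Flink and Scala 
version numbers here are assumptions, not taken from this ticket):
{code:scala}
// build.sbt sketch: a Scala 3 job compiled against the Java DataStream API,
// with the 2.12 Scala module kept only on the runtime classpath
scalaVersion := "3.2.2"

libraryDependencies ++= Seq(
  // Scala-free Java API used to compile the job
  "org.apache.flink" % "flink-streaming-java" % "1.16.1",
  // suffix is hard-coded to _2.12 because Flink publishes no 2.13/3 artifacts;
  // kept at runtime only because it ships DefaultScalaProductFieldAccessorFactory
  "org.apache.flink" % "flink-streaming-scala_2.12" % "1.16.1" % Runtime
)
{code}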

As a quick test, I copied only that class into my codebase; that also works.


> FieldAccessor has direct scala dependency
> -----------------------------------------
>
>                 Key: FLINK-23978
>                 URL: https://issues.apache.org/jira/browse/FLINK-23978
>             Project: Flink
>          Issue Type: Sub-task
>          Components: API / DataStream
>            Reporter: Chesnay Schepler
>            Assignee: Chesnay Schepler
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: 1.15.0
>
>
> The FieldAccessor class in flink-streaming-java has a hard dependency on 
> Scala. It would be ideal if we could restrict this dependency to 
> flink-streaming-scala.
> We could move the SimpleProductFieldAccessor & RecursiveProductFieldAccessor 
> to flink-streaming-scala, and load them in the FieldAccessorFactory via 
> reflection.
> This is one of a few steps that would allow the Java DataStream API to be 
> used without Scala being on the classpath.
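For illustration, a minimal Scala sketch of the reflection-based lookup 
described in the quoted issue (the fully-qualified class name and the fallback 
behaviour are assumptions, not the actual Flink implementation):
{code:scala}
// Sketch: look up the Scala-specific accessor factory only if
// flink-streaming-scala happens to be on the classpath.
object ScalaAccessorFactoryLookup {
  // class name taken from this ticket; the package prefix is an assumption
  private val FactoryClass =
    "org.apache.flink.streaming.util.typeutils.DefaultScalaProductFieldAccessorFactory"

  def load(): Option[AnyRef] =
    try {
      val clazz = Class.forName(FactoryClass)
      Some(clazz.getDeclaredConstructor().newInstance().asInstanceOf[AnyRef])
    } catch {
      // no Scala module on the classpath: caller falls back to Java-only accessors
      case _: ClassNotFoundException => None
    }
}
{code}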



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
