[ https://issues.apache.org/jira/browse/FLINK-33949?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17800913#comment-17800913 ]

Wencong Liu edited comment on FLINK-33949 at 12/28/23 3:45 AM:
---------------------------------------------------------------

Suppose we have two completely independent interfaces, I and J, both declaring 
a default method M with the same signature. If a class T implements both I and 
J but *does not override* the conflicting method M, neither default has 
priority over the other, and the compiler rejects such a class outright. If 
the conflict only appears after T has already been compiled (for example, a 
default implementation of M is later added to one of the interfaces), the 
class still loads, but the JVM throws an {{IncompatibleClassChangeError}} when 
the method is invoked at runtime, because it faces the same impossible 
decision: it does not know which interface's default implementation to call.
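
A minimal sketch of that conflict and its explicit resolution (the names I, J, 
and T are the illustrative ones from above, not Flink code):
{code:java}
interface I {
    default String m() { return "from I"; }
}

interface J {
    default String m() { return "from J"; }
}

// Without this override, javac rejects T ("inherits unrelated defaults
// for m() from types I and J"). If the second default only appears after
// T has been compiled, the JVM instead throws IncompatibleClassChangeError
// when m() is invoked.
class T implements I, J {
    @Override
    public String m() {
        return I.super.m(); // explicitly pick one of the two defaults
    }
}{code}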

However, if M is abstract in I or J, the implementing class T *must* provide 
an explicit implementation of the method. So no matter how interfaces I or J 
change (as long as the signature of their method M does not change), the 
behavior of T is unaffected and no {{IncompatibleClassChangeError}} can occur: 
T will continue to use its own implementation of M, disregarding any default 
implementations from the two interfaces.

 

I have created a test case in which StreamingRuntimeContext gets an added 
method returning a TestObject:
{code:java}
// Implements both interfaces and overrides the conflicting method itself.
public class TestObject implements TestInterface1, TestInterface2 {
    @Override
    public String getResult() {
        return "777";
    }
}

// Declares getResult() as an abstract method.
public interface TestInterface1 {
    String getResult();
}

// Declares getResult() as a default method.
public interface TestInterface2 {
    default String getResult() {
        return "666";
    }
}{code}
The job code follows. The job is compiled against the modified 
StreamingRuntimeContext in Flink.
{code:java}
public static void main(String[] args) throws Exception {
    StreamExecutionEnvironment executionEnvironment =
            StreamExecutionEnvironment.getExecutionEnvironment();
    DataStreamSource<Integer> source =
            executionEnvironment.fromData(3, 2, 1, 4, 5, 6, 7, 8);
    SingleOutputStreamOperator<String> result =
            source.map(new RichMapFunction<Integer, String>() {
                @Override
                public String map(Integer integer) {
                    StreamingRuntimeContext runtimeContext =
                            (StreamingRuntimeContext) getRuntimeContext();
                    // Calls the method added to StreamingRuntimeContext for
                    // this test and returns TestObject's own result.
                    return runtimeContext.getTestObject().getResult();
                }
            });
    CloseableIterator<String> jobResult = result.executeAndCollect();
    while (jobResult.hasNext()) {
        System.out.println(jobResult.next());
    }
} {code}
When I changed the abstract method getResult in TestInterface1 into a default 
method and recompiled Flink, the job still finished without any code changes 
or exceptions.
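
This is also easy to confirm locally without Flink. A small sketch using the 
classes above: since TestObject declares its own getResult(), virtual dispatch 
always selects the class's override, whichever form TestInterface1's method 
takes:
{code:java}
public class CompatCheck {
    public static void main(String[] args) {
        TestInterface1 viaIface1 = new TestObject();
        TestInterface2 viaIface2 = new TestObject();
        System.out.println(viaIface1.getResult()); // prints 777
        System.out.println(viaIface2.getResult()); // prints 777, not 666
    }
}{code}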

Therefore, I think METHOD_ABSTRACT_NOW_DEFAULT doesn't break source 
compatibility. WDYT? [~martijnvisser] 

 


> METHOD_ABSTRACT_NOW_DEFAULT should be both source compatible and binary 
> compatible
> ----------------------------------------------------------------------------------
>
>                 Key: FLINK-33949
>                 URL: https://issues.apache.org/jira/browse/FLINK-33949
>             Project: Flink
>          Issue Type: Bug
>          Components: Test Infrastructure
>    Affects Versions: 1.19.0
>            Reporter: Wencong Liu
>            Priority: Major
>             Fix For: 1.19.0
>
>
> Currently I'm trying to refactor some APIs annotated by @Public in 
> [FLIP-382: Unify the Provision of Diverse Metadata for Context-like APIs - 
> Apache Flink - Apache Software 
> Foundation|https://cwiki.apache.org/confluence/display/FLINK/FLIP-382%3A+Unify+the+Provision+of+Diverse+Metadata+for+Context-like+APIs].
>  When an abstract method is changed into a default method, the japicmp maven 
> plugin names this change METHOD_ABSTRACT_NOW_DEFAULT and considers it both 
> source incompatible and binary incompatible.
> The reason may be that if the abstract method becomes default, the logic in 
> the default method will be ignored by previous implementations, which keep 
> their own overrides.
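> A sketch of that concern (hypothetical names, not Flink classes): logic 
> added in the new default body never runs for an implementation that was 
> written against the abstract method and overrides it.
> {code:java}
> interface Context {
>     // was: String meta();  (an abstract method)
>     default String meta() {
>         return "new default logic"; // never runs for OldImpl below
>     }
> }
> 
> class OldImpl implements Context {
>     @Override
>     public String meta() {
>         return "old behavior"; // still selected after the interface change
>     }
> }{code}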
> I created a test case in which a job is compiled with the newly changed 
> default method and submitted to the previous version. No exception is thrown. 
> Therefore, METHOD_ABSTRACT_NOW_DEFAULT shouldn't be considered incompatible 
> for either source or binary. We could add the following settings to override 
> the default values for binary and source compatibility, for example:
> {code:java}
> <overrideCompatibilityChangeParameters>
>     <overrideCompatibilityChangeParameter>
>        <compatibilityChange>METHOD_ABSTRACT_NOW_DEFAULT</compatibilityChange>
>        <binaryCompatible>true</binaryCompatible>
>        <sourceCompatible>true</sourceCompatible>
>     </overrideCompatibilityChangeParameter>
> </overrideCompatibilityChangeParameters> {code}
> By the way, the master branch currently checks both source compatibility and 
> binary compatibility between minor versions. According to Flink's API 
> compatibility constraints, the master branch shouldn't check binary 
> compatibility. There is already FLINK-33009 to track this, and we should fix 
> it as soon as possible.


