och5351 commented on issue #10353:
URL: https://github.com/apache/seatunnel/issues/10353#issuecomment-3784918926

   Hi there! Apologies for the late response.
   
   The Flink 2.0.1 API differs significantly from earlier versions, and SeaTunnel appears to pick up whichever Flink version is present in the local environment.
   
   <img width="777" height="535" alt="Image" src="https://github.com/user-attachments/assets/2d3d6838-e644-40f6-b675-6d530c5b5fdf" />
   
   However, SeaTunnel does not yet ship a Flink 2.0 starter. When you attempt to use Flink 2.0+ with a Flink 1.20.3 client, you run into errors like these:
   
   ## sql-gateway
   
   ```plaintext
   SLF4J: Class path contains multiple SLF4J bindings.
   SLF4J: Found binding in 
[jar:file:/Users/kia2948351/Desktop/develop/flink-1.20.3/lib/log4j-slf4j-impl-2.24.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
   SLF4J: Found binding in 
[jar:file:/Users/kia2948351/Desktop/develop/hadoop-3.3.4/share/hadoop/common/lib/slf4j-reload4j-1.7.36.jar!/org/slf4j/impl/StaticLoggerBinder.class]
   SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an 
explanation.
   SLF4J: Actual binding is of type 
[org.apache.logging.slf4j.Log4jLoggerFactory]
   
   
   Exception in thread "main" org.apache.flink.table.client.SqlClientException: 
Failed to create the executor.
           at 
org.apache.flink.table.client.gateway.ExecutorImpl.<init>(ExecutorImpl.java:230)
           at 
org.apache.flink.table.client.gateway.ExecutorImpl.<init>(ExecutorImpl.java:148)
           at 
org.apache.flink.table.client.gateway.Executor.create(Executor.java:49)
           at org.apache.flink.table.client.SqlClient.start(SqlClient.java:90)
           at 
org.apache.flink.table.client.SqlClient.startClient(SqlClient.java:228)
           at org.apache.flink.table.client.SqlClient.main(SqlClient.java:179)
   Caused by: java.lang.IllegalArgumentException: No enum constant 
org.apache.flink.table.gateway.rest.util.SqlGatewayRestAPIVersion.V4
           at java.base/java.lang.Enum.valueOf(Enum.java:273)
           at 
org.apache.flink.table.gateway.rest.util.SqlGatewayRestAPIVersion.valueOf(SqlGatewayRestAPIVersion.java:38)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625)
           at 
java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
           at 
java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
           at 
java.base/java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:921)
           at 
java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
           at 
java.base/java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:682)
           at 
org.apache.flink.table.client.gateway.ExecutorImpl.negotiateVersion(ExecutorImpl.java:559)
           at 
org.apache.flink.table.client.gateway.ExecutorImpl.<init>(ExecutorImpl.java:193)
           ... 5 more
   ```
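
   The `No enum constant ... SqlGatewayRestAPIVersion.V4` failure comes from version negotiation: the `negotiateVersion` frame shows the client mapping each version string the gateway advertises through `Enum.valueOf`, and a Flink 2.0 gateway advertises `V4`, which the 1.20.3 client's enum does not define. A minimal sketch of that failure mode (the `ClientKnownVersion` enum and `negotiate` helper are hypothetical stand-ins, not actual Flink code):

   ```java
   public class VersionNegotiationSketch {
       // Hypothetical stand-in for the 1.20 client's SqlGatewayRestAPIVersion
       // enum, which has no V4 constant.
       enum ClientKnownVersion { V1, V2, V3 }

       static String negotiate(String advertised) {
           try {
               // Enum.valueOf is what throws in the stack trace above when the
               // gateway reports a version string the client does not know.
               return ClientKnownVersion.valueOf(advertised).name();
           } catch (IllegalArgumentException e) {
               return "UNSUPPORTED:" + advertised;
           }
       }
   }
   ```

   So the client aborts during startup before any query runs, which is why the SQL client fails with "Failed to create the executor".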
   
   ## Flink API
   
    ```plaintext
   Caused by: java.io.InvalidClassException: 
org.apache.flink.table.runtime.typeutils.InternalTypeInfo; local class 
incompatible: stream classdesc serialVersionUID = 4262328858682975439, local 
class serialVersionUID = 5235578560101400674
           at java.base/java.io.ObjectStreamClass.initNonProxy(Unknown Source)
           at java.base/java.io.ObjectInputStream.readNonProxyDesc(Unknown 
Source)
           at java.base/java.io.ObjectInputStream.readClassDesc(Unknown Source)
           at java.base/java.io.ObjectInputStream.readOrdinaryObject(Unknown 
Source)
           at java.base/java.io.ObjectInputStream.readObject0(Unknown Source)
           at java.base/java.io.ObjectInputStream$FieldValues.<init>(Unknown 
Source)
           at java.base/java.io.ObjectInputStream.readSerialData(Unknown Source)
           at java.base/java.io.ObjectInputStream.readOrdinaryObject(Unknown 
Source)
           at java.base/java.io.ObjectInputStream.readObject0(Unknown Source)
           at java.base/java.io.ObjectInputStream$FieldValues.<init>(Unknown 
Source)
           at java.base/java.io.ObjectInputStream.readSerialData(Unknown Source)
           at java.base/java.io.ObjectInputStream.readOrdinaryObject(Unknown 
Source)
           at java.base/java.io.ObjectInputStream.readObject0(Unknown Source)
           at java.base/java.io.ObjectInputStream$FieldValues.<init>(Unknown 
Source)
           at java.base/java.io.ObjectInputStream.readSerialData(Unknown Source)
           at java.base/java.io.ObjectInputStream.readOrdinaryObject(Unknown 
Source)
           at java.base/java.io.ObjectInputStream.readObject0(Unknown Source)
           at java.base/java.io.ObjectInputStream$FieldValues.<init>(Unknown 
Source)
           at java.base/java.io.ObjectInputStream.readSerialData(Unknown Source)
           at java.base/java.io.ObjectInputStream.readOrdinaryObject(Unknown 
Source)
           at java.base/java.io.ObjectInputStream.readObject0(Unknown Source)
           at java.base/java.io.ObjectInputStream.readObject(Unknown Source)
           at java.base/java.io.ObjectInputStream.readObject(Unknown Source)
           at 
org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:488)
           at 
org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:472)
           at 
org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:467)
           at 
org.apache.flink.util.SerializedValue.deserializeValue(SerializedValue.java:67)
           at 
org.apache.flink.runtime.operators.coordination.OperatorCoordinatorHolder.create(OperatorCoordinatorHolder.java:497)
           at 
org.apache.flink.runtime.executiongraph.ExecutionJobVertex.createOperatorCoordinatorHolder(ExecutionJobVertex.java:313)
           at 
org.apache.flink.runtime.executiongraph.ExecutionJobVertex.<init>(ExecutionJobVertex.java:175)
           ... 20 more
   ```
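
   The `InvalidClassException` is Java serialization's version check: every serializable class carries a `serialVersionUID`, and on deserialization the JVM compares the UID recorded in the stream (here written with the Flink 2.0 jars) against the UID of the local class (here `InternalTypeInfo` from the 1.20 jars). The two releases declare different values, so deserialization is rejected. A small sketch of where that UID lives (the `Payload` class is a hypothetical example, not Flink code):

   ```java
   import java.io.ObjectStreamClass;
   import java.io.Serializable;

   public class UidSketch {
       // Hypothetical serializable class with an explicitly declared UID.
       static class Payload implements Serializable {
           private static final long serialVersionUID = 1L;
           int value;
       }

       // Reads the UID the JVM would compare against the one in the stream;
       // if they differ, deserialization throws InvalidClassException.
       static long localUid(Class<?> c) {
           return ObjectStreamClass.lookup(c).getSerialVersionUID();
       }
   }
   ```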
   
   Connector dependencies also differ between Flink versions. JDBC, for example:
   
   **Flink 1.20 requires:**
     * https://repo1.maven.org/maven2/org/apache/flink/flink-connector-jdbc/3.3.0-1.20/flink-connector-jdbc-3.3.0-1.20.jar
     * https://repo1.maven.org/maven2/org/postgresql/postgresql/42.5.4/postgresql-42.5.4.jar

   **Flink 2.0 requires:**
     * https://repo1.maven.org/maven2/org/apache/flink/flink-connector-jdbc-core/4.0.0-2.0/flink-connector-jdbc-core-4.0.0-2.0.jar
     * https://repo1.maven.org/maven2/org/apache/flink/flink-connector-jdbc-postgres/4.0.0-2.0/flink-connector-jdbc-postgres-4.0.0-2.0.jar
     * https://repo1.maven.org/maven2/org/postgresql/postgresql/42.7.8/postgresql-42.7.8.jar
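
   For anyone reproducing this, the Flink 2.0 jars above can be dropped into the Flink `lib` directory with a loop like the following (a rough sketch of the setup step; `FLINK_HOME` being set and `curl` being available are assumptions about your environment, and the URLs are exactly the ones listed above):

   ```shell
   #!/usr/bin/env sh
   # Sketch: download the Flink 2.0 JDBC connector jars into $FLINK_HOME/lib.
   # Assumes FLINK_HOME is set and curl is installed.
   FLINK_LIB="${FLINK_HOME:?set FLINK_HOME first}/lib"
   for url in \
     https://repo1.maven.org/maven2/org/apache/flink/flink-connector-jdbc-core/4.0.0-2.0/flink-connector-jdbc-core-4.0.0-2.0.jar \
     https://repo1.maven.org/maven2/org/apache/flink/flink-connector-jdbc-postgres/4.0.0-2.0/flink-connector-jdbc-postgres-4.0.0-2.0.jar \
     https://repo1.maven.org/maven2/org/postgresql/postgresql/42.7.8/postgresql-42.7.8.jar
   do
     (cd "$FLINK_LIB" && curl -fsSLO "$url")
   done
   ```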
   
   

