zhang5059T opened a new issue, #7011:
URL: https://github.com/apache/seatunnel/issues/7011

   ### Search before asking
   
   - [X] I had searched in the [issues](https://github.com/apache/seatunnel/issues?q=is%3Aissue+label%3A%22bug%22) and found no similar issues.
   
   
   ### What happened
   
   ## Environment
    The SeaTunnel job is run with spark-operator on Kubernetes.
   
   ### SeaTunnel Version
   
   2.3.5
   
   ### SeaTunnel Config
   
   ```conf
   env {
     parallelism = 8
   }
   source {
       Jdbc {
           url = "jdbc:oracle:thin:@x.x.x.x:1521:orcl"
           driver = "oracle.jdbc.OracleDriver"
           user = "seatunnel"
           password = "seatunnel123456"
           query = "select * from seatunnel.FA_DATA_23_15"
       }
   }
   sink {
       Jdbc {
           url = "jdbc:oracle:thin:@x.x.x.x:1521:orcl"
           driver = "oracle.jdbc.OracleDriver"
           user = "seatunnel01"
           password = "seatunnel123456"
            generate_sink_sql = true
            database = "ORCL"
           table = "FA_DATA_23_15"
       }
   }
   ```
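
    For reference only: with `generate_sink_sql = true` the JDBC sink builds the insert statement itself from `database` and `table`. The same sink could instead spell the statement out explicitly via `query`. This is a sketch, not a confirmed fix; `COL1`/`COL2` are placeholders, not the real schema of `FA_DATA_23_15`:

    ```conf
    sink {
        Jdbc {
            url = "jdbc:oracle:thin:@x.x.x.x:1521:orcl"
            driver = "oracle.jdbc.OracleDriver"
            user = "seatunnel01"
            password = "seatunnel123456"
            # Hypothetical explicit insert; replace COL1, COL2 with the
            # actual columns of FA_DATA_23_15.
            query = "insert into FA_DATA_23_15(COL1, COL2) values(?, ?)"
        }
    }
    ```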
   
   
   ### Running Command
   
    ```yaml
   apiVersion: "sparkoperator.k8s.io/v1beta2"
   kind: SparkApplication
   metadata:
     name: spark-seatunnel
     namespace: spark-operator
   spec:
     type: Scala
     mode: cluster
     image: "datawork/spark3.3.0-seatunnal2.3.5:v2-r1"
     mainClass: org.apache.seatunnel.core.starter.spark.SeaTunnelSpark
     arguments: ["--config", "/data/v2.oracle.conf"]
      mainApplicationFile: "http://x.x.x.x:32175/spark/seatunnel/seatunnel-spark-3-starter-1.jar"
     imagePullPolicy: Always
     sparkVersion: "3.3.0"
     restartPolicy:
       type: Never
     volumes:
       - name: seatunnel-oracle
         configMap:
           name: seatunnel-oracle
           items:
             - key: v2.oracle.conf
               path: v2.oracle.conf
      deps:
        jars: ["local:///opt/seatunnel/lib/ojdbc8-23.4.0.24.05.jar", "local:///opt/seatunnel/lib/seatunnel-transforms-v2.jar"]
     driver:
       cores: 1
       coreLimit: "1200m"
       memory: "512m"
       labels:
         version: 3.3.0
       serviceAccount: spark-release-spark-operator
       env:
         - name: SEATUNNEL_HOME
           value: "/opt/seatunnel"
       volumeMounts:
         - name: seatunnel-oracle
           mountPath: /data/v2.oracle.conf
           subPath: v2.oracle.conf
     executor:
       cores: 1
       instances: 1
       memory: "512m"
       env:
         - name: SEATUNNEL_HOME
           value: "/opt/seatunnel"
       labels:
         version: 3.3.0
       volumeMounts:
         - name: seatunnel-oracle
           mountPath: /data/v2.oracle.conf
           subPath: v2.oracle.conf
   ```
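
    A workaround sketch for the exception below (assumptions: paths and field names taken from the spec above; this is a common mitigation for RDD-deserialization classloader mismatches in Spark, not a confirmed fix for this issue): put the SeaTunnel jars on the system classpath of both driver and executors via `sparkConf`, instead of (or in addition to) `deps.jars`:

    ```yaml
    spec:
      sparkConf:
        # Load the connector jars with the same (system) classloader on the
        # driver and the executors; mismatched classloaders are the usual
        # cause of "cannot assign instance of List$SerializationProxy to
        # field RDD.dependencies_" during task deserialization.
        "spark.driver.extraClassPath": "/opt/seatunnel/lib/*"
        "spark.executor.extraClassPath": "/opt/seatunnel/lib/*"
    ```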
   
   
   ### Error Exception
   
   ```log
   ### Exception
    24/06/18 07:29:59 INFO DAGScheduler: ResultStage 0 (save at SinkExecuteProcessor.java:162) failed in 0.784 s due to Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 6) (10.16.59.102 executor 1): java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD
        at java.base/java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(Unknown Source)
        at java.base/java.io.ObjectStreamClass$FieldReflector.checkObjectFieldValueTypes(Unknown Source)
        at java.base/java.io.ObjectStreamClass.checkObjFieldValueTypes(Unknown Source)
        at java.base/java.io.ObjectInputStream.defaultCheckFieldValues(Unknown Source)
        at java.base/java.io.ObjectInputStream.readSerialData(Unknown Source)
        at java.base/java.io.ObjectInputStream.readOrdinaryObject(Unknown Source)
        at java.base/java.io.ObjectInputStream.readObject0(Unknown Source)
        at java.base/java.io.ObjectInputStream.defaultReadFields(Unknown Source)
        at java.base/java.io.ObjectInputStream.readSerialData(Unknown Source)
        at java.base/java.io.ObjectInputStream.readOrdinaryObject(Unknown Source)
        at java.base/java.io.ObjectInputStream.readObject0(Unknown Source)
        at java.base/java.io.ObjectInputStream.readObject(Unknown Source)
        at java.base/java.io.ObjectInputStream.readObject(Unknown Source)
        at scala.collection.immutable.List$SerializationProxy.readObject(List.scala:527)
        at jdk.internal.reflect.GeneratedMethodAccessor2.invoke(Unknown Source)
        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
        at java.base/java.lang.reflect.Method.invoke(Unknown Source)
        at java.base/java.io.ObjectStreamClass.invokeReadObject(Unknown Source)
        at java.base/java.io.ObjectInputStream.readSerialData(Unknown Source)
        at java.base/java.io.ObjectInputStream.readOrdinaryObject(Unknown Source)
        at java.base/java.io.ObjectInputStream.readObject0(Unknown Source)
        at java.base/java.io.ObjectInputStream.defaultReadFields(Unknown Source)
        at java.base/java.io.ObjectInputStream.readSerialData(Unknown Source)
        at java.base/java.io.ObjectInputStream.readOrdinaryObject(Unknown Source)
        at java.base/java.io.ObjectInputStream.readObject0(Unknown Source)
        at java.base/java.io.ObjectInputStream.defaultReadFields(Unknown Source)
        at java.base/java.io.ObjectInputStream.readSerialData(Unknown Source)
        at java.base/java.io.ObjectInputStream.readOrdinaryObject(Unknown Source)
        at java.base/java.io.ObjectInputStream.readObject0(Unknown Source)
        at java.base/java.io.ObjectInputStream.readObject(Unknown Source)
        at java.base/java.io.ObjectInputStream.readObject(Unknown Source)
        at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:87)
        at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:129)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:83)
        at org.apache.spark.scheduler.Task.run(Task.scala:136)
        at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:548)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1504)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:551)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
        at java.base/java.lang.Thread.run(Unknown Source)

    Driver stacktrace:
    24/06/18 07:29:59 INFO DAGScheduler: Job 0 failed: save at SinkExecuteProcessor.java:162, took 0.812955 s
    24/06/18 07:29:59 INFO TaskSetManager: Lost task 1.3 in stage 0.0 (TID 7) on 10.16.59.102, executor 1: java.lang.ClassCastException (cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD) [duplicate 7]
    24/06/18 07:29:59 ERROR AppendDataExec: Data source write support org.apache.seatunnel.translation.spark.sink.SeaTunnelBatchWrite@64514009 is aborting.
    24/06/18 07:29:59 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
    24/06/18 07:29:59 ERROR AppendDataExec: Data source write support org.apache.seatunnel.translation.spark.sink.SeaTunnelBatchWrite@64514009 aborted.
    24/06/18 07:29:59 ERROR SeaTunnel:
   ```
   
   
   ### Zeta or Flink or Spark Version
   
    - Spark: 3.3.0
    - SeaTunnel: 2.3.5
   
   
   
   ### Java or Scala Version
   
    - Scala: 2.12.15
    - Java: OpenJDK 11.0.13
   
   ### Screenshots
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of Conduct](https://www.apache.org/foundation/policies/conduct)
   

