raheen1 opened a new issue, #6319:
URL: https://github.com/apache/seatunnel/issues/6319

   ### Search before asking
   
   - [X] I had searched in the 
[issues](https://github.com/apache/seatunnel/issues?q=is%3Aissue+label%3A%22bug%22)
 and found no similar issues.
   
   
   ### What happened
   
   I am trying to run a job using spark, using the quick-start-spark 
[documentation](https://seatunnel.apache.org/docs/start-v2/locally/quick-start-spark/)
 
   
   Versions I am using:
   Apache Spark version: 3.5.0
   Apache SeaTunnel: 2.3.3
   
   When I run a configuration with BATCH or STREAMING as the job mode, the 
output displays only the column names, not the actual data, and the job throws 
the exception shown below:
   
   ```
   Output: name#0, age#1
            
   24/01/31 00:35:32 INFO CodeGenerator: Code generated in 342.806808 ms
   Exception in thread "main" java.lang.IncompatibleClassChangeError: 
Conflicting default methods: 
org/apache/spark/sql/connector/write/BatchWrite.useCommitCoordinator 
org/apache/spark/sql/connector/write/streaming/StreamingWrite.useCommitCoordinator
   ```
   
   I believe the output should include all the rows of data and not just the 
column names.
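   For context on the error itself: the JVM raises `IncompatibleClassChangeError: Conflicting default methods` when a class implements two interfaces that both supply a default method with the same signature and the class (compiled against an older API where only one default existed) never overrides it. That appears to be what happens here: Spark 3.5 gives both `BatchWrite` and `StreamingWrite` a default `useCommitCoordinator()`, while the SeaTunnel 2.3.3 connector was built against an older Spark. A minimal sketch with hypothetical stand-in interfaces (not Spark's real classes) shows the diamond and how an explicit override resolves it:

   ```java
   // Hypothetical stand-ins for Spark's BatchWrite and StreamingWrite,
   // each contributing a default useCommitCoordinator().
   interface BatchWriteLike {
       default boolean useCommitCoordinator() { return true; }
   }

   interface StreamingWriteLike {
       default boolean useCommitCoordinator() { return true; }
   }

   // A class implementing both inherits conflicting defaults. If it is
   // compiled without an override (possible when only one interface had the
   // default at compile time) and then run against the newer interfaces,
   // the JVM throws IncompatibleClassChangeError. Recompiling with an
   // explicit override resolves the conflict:
   class CombinedWriter implements BatchWriteLike, StreamingWriteLike {
       @Override
       public boolean useCommitCoordinator() {
           // Explicitly pick one of the inherited defaults.
           return BatchWriteLike.super.useCommitCoordinator();
       }
   }

   public class ConflictDemo {
       public static void main(String[] args) {
           System.out.println(new CombinedWriter().useCommitCoordinator());
       }
   }
   ```

   So the error suggests a binary incompatibility between the SeaTunnel connector jar and Spark 3.5, rather than a problem with the config itself.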
   
   Here is my config file:
   ```
   {
       "env" : {
           "execution.parallelism" : 2,
           "job.mode" : "BATCH",
           "checkpoint.interval" : 10000
       },
       "source" : [
           {
               "schema" : {
                   "fields" : {
                       "name" : "string",
                       "age" : "int"
                   }
               },
               "row.num" : 16,
               "parallelism" : 2,
               "result_table_name" : "fake",
               "plugin_name" : "FakeSource"
           }
       ],
       "sink" : [
           {
               "plugin_name" : "Console"
           }
       ]
   }
   ```
   
   I get the same exception even after changing the job mode. 
   
   I have two questions:
   * Does the above output mean the job has successfully run?
   * How do I fix this issue?
   
   
   ### SeaTunnel Version
   
   2.3.3
   
   ### SeaTunnel Config
   
   ```conf
   {
       "env" : {
           "execution.parallelism" : 2,
           "job.mode" : "BATCH",
           "checkpoint.interval" : 10000
       },
       "source" : [
           {
               "schema" : {
                   "fields" : {
                       "name" : "string",
                       "age" : "int"
                   }
               },
               "row.num" : 16,
               "parallelism" : 2,
               "result_table_name" : "fake",
               "plugin_name" : "FakeSource"
           }
       ],
       "sink" : [
           {
               "plugin_name" : "Console"
           }
       ]
   }
   ```
   
   
   ### Running Command
   
   ```shell
   ./start-seatunnel-spark-3-connector-v2.sh --master local[4] --deploy-mode 
client --config /root/apache-seatunnel-2.3.4/config/v2.batch.config.template
   ```
   
   
   ### Error Exception
   
   ```log
   Exception in thread "main" java.lang.IncompatibleClassChangeError: 
Conflicting default methods: 
org/apache/spark/sql/connector/write/BatchWrite.useCommitCoordinator 
org/apache/spark/sql/connector/write/streaming/StreamingWrite.useCommitCoordinator
   ```
   
   
   ### Zeta or Flink or Spark Version
   
   Spark version: 3.5.0
   
   ### Java or Scala Version
   
   8
   
   ### Screenshots
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://www.apache.org/foundation/policies/conduct)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
