xuebotao opened a new issue, #5127: URL: https://github.com/apache/seatunnel/issues/5127
### Search before asking

- [X] I had searched in the [issues](https://github.com/apache/seatunnel/issues?q=is%3Aissue+label%3A%22bug%22) and found no similar issues.

### What happened

Running a SeaTunnel job on Spark (Iceberg source, Filter transform, Doris sink) fails as soon as the sink stage executes:

```
java.lang.UnsupportedOperationException: org.apache.seatunnel.translation.spark.sink.write.SeaTunnelWriteBuilder does not support batch write
    at org.apache.spark.sql.connector.write.WriteBuilder.buildForBatch(WriteBuilder.java:44)
    at org.apache.spark.sql.execution.datasources.v2.AppendDataExec.run(WriteToDataSourceV2Exec.scala:259)
    ...
    at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:354)
    at org.apache.seatunnel.core.starter.spark.execution.SinkExecuteProcessor.execute(SinkExecuteProcessor.java:144)
    at org.apache.seatunnel.core.starter.spark.execution.SparkExecution.execute(SparkExecution.java:74)
    ...
```

(Abridged; the full stack trace is in the "Error Exception" section below.)

### SeaTunnel Version

2.3.1

### SeaTunnel Config

```conf
{
  "env" : {
"checkpoint.interval" : 10000, "execution.parallelism" : 1 }, "sink" : [ { "password" : "xx", "fenodes" : "xx", "sink.enable-2pc" : "true", "doris.config" : { "format" : "json", "read_json_by_line" : "true" }, "query" : "insert into xx(event) values(?)", "table.identifier" : "xx.xx", "source_table_name" : "xx", "plugin_name" : "Doris", "sink.label-prefix" : "test_json", "username" : "xx" } ], "source" : [ { "catalog_name" : "ue1", "catalog_type" : "hive", "namespace" : "default", "result_table_name" : "student_seatunnel", "warehouse" : "xx", "plugin_name" : "Iceberg", "uri" : "xxx", "table" : "student_seatunnel" } ], "transform" : [ { "result_table_name" : "ods_ds_user_behavior_test", "source_table_name" : "student_seatunnel", "fields" : [ "name" ], "plugin_name" : "Filter" } ] } ``` ### Running Command ```shell spark submint ``` ### Error Exception ```log [2023-07-20 11:09:48.625][org.apache.seatunnel.core.starter.spark.command.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:62)][main][main][][org.apache.seatunnel.core.starter.spark.command.SparkTaskExecuteCommand][Run SeaTunnel on spark failed.][][][Ex] java.lang.UnsupportedOperationException: org.apache.seatunnel.translation.spark.sink.write.SeaTunnelWriteBuilder does not support batch write at org.apache.spark.sql.connector.write.WriteBuilder.buildForBatch(WriteBuilder.java:44) at org.apache.spark.sql.execution.datasources.v2.AppendDataExec.run(WriteToDataSourceV2Exec.scala:259) at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result$lzycompute(V2CommandExec.scala:39) at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result(V2CommandExec.scala:39) at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.doExecute(V2CommandExec.scala:54) at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:175) at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:213) at 
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151) at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:210) at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:171) at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:122) at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:121) at org.apache.spark.sql.DataFrameWriter.$anonfun$runCommand$1(DataFrameWriter.scala:963) at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:100) at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:160) at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:87) at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:769) at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64) at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:963) at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:354) at org.apache.seatunnel.core.starter.spark.execution.SinkExecuteProcessor.execute(SinkExecuteProcessor.java:144) at org.apache.seatunnel.core.starter.spark.execution.SparkExecution.execute(SparkExecution.java:74) at org.apache.seatunnel.core.starter.spark.command.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:60) at org.apache.seatunnel.core.starter.SeaTunnel.run(SeaTunnel.java:40) at org.apache.seatunnel.core.starter.spark.SeaTunnelSpark.main(SeaTunnelSpark.java:35) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52) at 
org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928) at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180) at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203) at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90) at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007) at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016) at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) [2023-07-20 11:09:48.628][org.apache.seatunnel.core.starter.SeaTunnel.showFatalError(SeaTunnel.java:61)][main][main][][org.apache.seatunnel.core.starter.SeaTunnel][ =============================================================================== ][][][Ex] [2023-07-20 11:09:48.628][org.apache.seatunnel.core.starter.SeaTunnel.showFatalError(SeaTunnel.java:64)][main][main][][org.apache.seatunnel.core.starter.SeaTunnel][Fatal Error, ][][][Ex] [2023-07-20 11:09:48.628][org.apache.seatunnel.core.starter.SeaTunnel.showFatalError(SeaTunnel.java:66)][main][main][][org.apache.seatunnel.core.starter.SeaTunnel][Please submit bug report in https://github.com/apache/incubator-seatunnel/issues ][][][Ex] [2023-07-20 11:09:48.628][org.apache.seatunnel.core.starter.SeaTunnel.showFatalError(SeaTunnel.java:68)][main][main][][org.apache.seatunnel.core.starter.SeaTunnel][Reason:org.apache.seatunnel.translation.spark.sink.write.SeaTunnelWriteBuilder does not support batch write ][][][Ex] [2023-07-20 11:09:48.629][org.apache.seatunnel.core.starter.SeaTunnel.showFatalError(SeaTunnel.java:69)][main][main][][org.apache.seatunnel.core.starter.SeaTunnel][Exception StackTrace:org.apache.seatunnel.core.starter.exception.CommandExecuteException: org.apache.seatunnel.translation.spark.sink.write.SeaTunnelWriteBuilder does not support batch write at org.apache.seatunnel.core.starter.spark.command.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:63) at 
org.apache.seatunnel.core.starter.SeaTunnel.run(SeaTunnel.java:40) at org.apache.seatunnel.core.starter.spark.SeaTunnelSpark.main(SeaTunnelSpark.java:35) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52) at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928) at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180) at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203) at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90) at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007) at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016) at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) ][][][Ex] [2023-07-20 11:09:48.629][org.apache.seatunnel.core.starter.SeaTunnel.showFatalError(SeaTunnel.java:70)][main][main][][org.apache.seatunnel.core.starter.SeaTunnel][ =============================================================================== ][][][Ex] Exception in thread "main" org.apache.seatunnel.core.starter.exception.CommandExecuteException: org.apache.seatunnel.translation.spark.sink.write.SeaTunnelWriteBuilder does not support batch write at org.apache.seatunnel.core.starter.spark.command.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:63) at org.apache.seatunnel.core.starter.SeaTunnel.run(SeaTunnel.java:40) at org.apache.seatunnel.core.starter.spark.SeaTunnelSpark.main(SeaTunnelSpark.java:35) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52) at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928) at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180) at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203) at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90) at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007) at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016) ``` ### Flink or Spark Version spark 3 ### Java or Scala Version 1.8 ### Screenshots _No response_ ### Are you willing to submit PR? - [X] Yes I am willing to submit a PR! ### Code of Conduct - [X] I agree to follow this project's [Code of Conduct](https://www.apache.org/foundation/policies/conduct) -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: [email protected] For queries about this service, please contact Infrastructure at: [email protected]
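For context on where the message comes from: the top stack frame, `WriteBuilder.buildForBatch(WriteBuilder.java:44)`, is Spark's DataSource V2 `WriteBuilder` interface, whose `buildForBatch` is a *default* method that throws exactly this `UnsupportedOperationException` with the implementing class's name, unless the sink's builder overrides it. The sketch below reproduces that mechanism without Spark on the classpath; the interface is re-declared locally and `StreamingOnlyWriteBuilder` is a hypothetical stand-in, not SeaTunnel's actual class.

```java
// Spark-free sketch of the default-method pattern behind the error message.
// In Spark's DSv2 API, WriteBuilder.buildForBatch() has a default body that
// throws UnsupportedOperationException; any builder that never overrides it
// fails the moment Spark plans a batch (non-streaming) write.
interface WriteBuilder {
    default Object buildForBatch() {
        // Mirrors the default implementation at WriteBuilder.java:44.
        throw new UnsupportedOperationException(
                getClass().getName() + " does not support batch write");
    }
}

// Hypothetical stand-in for a builder that only wires up the streaming path
// and leaves buildForBatch() at its throwing default.
class StreamingOnlyWriteBuilder implements WriteBuilder {
}

public class BatchWriteDemo {
    // Returns the error message a batch write would surface, or "ok".
    static String tryBatchWrite(WriteBuilder builder) {
        try {
            builder.buildForBatch();
            return "ok";
        } catch (UnsupportedOperationException e) {
            return e.getMessage();
        }
    }

    public static void main(String[] args) {
        System.out.println(tryBatchWrite(new StreamingOnlyWriteBuilder()));
    }
}
```

So the exception suggests the translation layer's `SeaTunnelWriteBuilder` reached that default method for this sink, i.e. the batch write path was not wired up in this code path in 2.3.1.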
