sgirabin opened a new issue, #5317:
URL: https://github.com/apache/seatunnel/issues/5317

   ### Search before asking
   
   - [X] I had searched in the 
[issues](https://github.com/apache/seatunnel/issues?q=is%3Aissue+label%3A%22bug%22)
 and found no similar issues.
   
   
   ### What happened
   
   **Background**
   
   The SeaTunnel v2.3.2 documentation lists the Transform Common Options as follows:
   
   | name | type | required | default value | 
   | -- | -- | -- | -- | 
   | result_table_name | string | no | - | 
   | source_table_name | string | no | - | 
   
   Reference: 
https://seatunnel.apache.org/docs/2.3.2/transform-v2/common-options
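
   For reference, when these options are declared they chain plugins explicitly: a source registers its output under `result_table_name`, and a downstream transform consumes it via `source_table_name`. A minimal sketch (the `FakeSource` connector and the table names here are only illustrative):

   ```conf
   source {
     FakeSource {
       # Registers this source's output so downstream plugins can reference it
       result_table_name = "fake_input"
     }
   }

   transform {
     FieldMapper {
       # Consumes the table registered above, and registers its own output
       source_table_name = "fake_input"
       result_table_name = "mapped_output"
     }
   }
   ```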
   
   **Issue**
   Create a config that uses the FieldMapper transform (Transform v2) without specifying `result_table_name` and `source_table_name`, for example:
   
   ```conf
   transform {
       FieldMapper {
         field_mapper = {
           request_time              = ts
           event_type                = etype
           request_id                = request_id
         }
       }
   }
   ```

   (The full job config, including the `env`, `source`, and `sink` blocks, is in the SeaTunnel Config section below.)
    
   Running SeaTunnel with the above config file throws the following exception:
   
   ```log
   23/08/16 15:50:21 ERROR SparkTaskExecuteCommand: Run SeaTunnel on spark failed.
   java.lang.IllegalArgumentException: The configuration missing key: source_table_name
        at org.apache.seatunnel.transform.common.AbstractSeaTunnelTransform.prepare(AbstractSeaTunnelTransform.java:46)
        ...
   ```

   (The full stack trace is in the Error Exception section below.)
   
   **Expectation**
   
   `result_table_name` and `source_table_name` are optional parameters, so they should not need to be declared in the config file.
       
   
   **Workaround**
   
   Declare `source_table_name` and `result_table_name` with empty values.
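
   Expressed against the FieldMapper config above, the workaround looks like this (assuming an empty string counts as the "empty value"):

   ```conf
   transform {
       FieldMapper {
         # Workaround: declare the supposedly optional options explicitly
         # with empty values so the required-key check passes
         source_table_name = ""
         result_table_name = ""
         field_mapper = {
           request_time              = ts
           event_type                = etype
           request_id                = request_id
         }
       }
   }
   ```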
   
   
   ### SeaTunnel Version
   
   2.3.2
   
   ### SeaTunnel Config
   
   ```conf
   env {
     # You can set SeaTunnel environment configuration here
     execution.parallelism = 2
     job.mode = "BATCH"
     checkpoint.interval = 10000
     #execution.checkpoint.interval = 10000
     #execution.checkpoint.data-uri = "hdfs://localhost:9000/checkpoint"
   }
   
   source {
     HdfsFile {
       delimiter = ","
       schema {
          fields {
            request_time = "Timestamp"
            event_type = "String"
            request_id = "String"
          }
       }
       path = "/user/xyz/warehouse/xyz.db/click_log/dt=2023-08-01/hh=20/"
       file_format_type = "csv"
       fs.defaultFS = "hdfs://namenode1:8020"
     }
   }
   
   transform {
       FieldMapper {
         field_mapper = {
           request_time              = ts
           event_type                = etype
           request_id                = request_id
         }
       }
   }
   
   sink {
     Console {
     }
   }
   ```
   
   
   ### Running Command
   
   ```shell
   /bin/bash bin/start-seatunnel-spark-3-connector-v2.sh --config 
config/click.conf --master "local[4]" --deploy-mode client
   ```
   
   
   ### Error Exception
   
   ```log
   23/08/16 15:50:21 ERROR SparkTaskExecuteCommand: Run SeaTunnel on spark 
failed.
   java.lang.IllegalArgumentException: The configuration missing key: 
source_table_name
        at 
org.apache.seatunnel.transform.common.AbstractSeaTunnelTransform.prepare(AbstractSeaTunnelTransform.java:46)
        at 
org.apache.seatunnel.core.starter.spark.execution.TransformExecuteProcessor.lambda$initializePlugins$0(TransformExecuteProcessor.java:84)
        at 
java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
        at 
java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1374)
        at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
        at 
java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
        at 
java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
        at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
        at 
java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
        at 
org.apache.seatunnel.core.starter.spark.execution.TransformExecuteProcessor.initializePlugins(TransformExecuteProcessor.java:89)
        at 
org.apache.seatunnel.core.starter.spark.execution.SparkAbstractPluginExecuteProcessor.<init>(SparkAbstractPluginExecuteProcessor.java:49)
        at 
org.apache.seatunnel.core.starter.spark.execution.TransformExecuteProcessor.<init>(TransformExecuteProcessor.java:61)
        at 
org.apache.seatunnel.core.starter.spark.execution.SparkExecution.<init>(SparkExecution.java:62)
        at 
org.apache.seatunnel.core.starter.spark.command.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:59)
        at org.apache.seatunnel.core.starter.SeaTunnel.run(SeaTunnel.java:40)
        at 
org.apache.seatunnel.core.starter.spark.SeaTunnelSpark.main(SeaTunnelSpark.java:35)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at 
org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at 
org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:951)
        at 
org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
        at 
org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1030)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1039)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   23/08/16 15:50:21 ERROR SeaTunnel:
   
   
===============================================================================
   
   
   23/08/16 15:50:21 ERROR SeaTunnel: Fatal Error,
   
   23/08/16 15:50:21 ERROR SeaTunnel: Please submit bug report in 
https://github.com/apache/seatunnel/issues
   
   23/08/16 15:50:21 ERROR SeaTunnel: Reason:The configuration missing key: 
source_table_name
   ```
   
   
   ### Zeta or Flink or Spark Version
   
   _No response_
   
   ### Java or Scala Version
   
   _No response_
   
   ### Screenshots
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://www.apache.org/foundation/policies/conduct)
   

