devbiu opened a new issue, #4980:
URL: https://github.com/apache/seatunnel/issues/4980

   ### Search before asking
   
   - [X] I had searched in the 
[issues](https://github.com/apache/seatunnel/issues?q=is%3Aissue+label%3A%22bug%22)
 and found no similar issues.
   
   
   ### What happened
   
   Synchronizing data from Kafka to MySQL using Flink. The data in the Kafka topic is as follows:
   
   {"id":"10", "name":"test_admin", "sex":"man", "age":"26", "raw_message":"sun#wuk"}
   
   {"id":"11", "name":"test_admin", "sex":"man", "age":"28", "raw_message":"tang#sz"}
   
   {"id":"12", "name":"test_admin", "sex":"man", "age":"28", "raw_message":"zhu#bj"}
   
   {"id":"14", "name":"test_admin", "sex":"man", "age":"28", "raw_message":"bai,gj"}
   
   Because we could not find a good way to specify the source field for the split, we tried splitting the raw_message field.
   
   The separator used is the # sign; one of the records uses , as its separator instead.
   
   The sink we use is MySQL; its table contains the columns id, name, sex, age, raw_message, plus the split output columns first_name and last_name.
   
   A Console sink is also used for printing.
   
   However, raw_message is not split: first_name and last_name are always empty in the MySQL table, and the Console output contains neither first_name nor last_name.
   
   The other fields are written to the MySQL table correctly.
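   The intended behavior can be sketched as follows (plain Python, not SeaTunnel code; the padding of missing parts with empty strings is an assumption about how the transform should handle records without the separator):
   
   ```python
   # Hypothetical sketch of the intended split: cut raw_message on "#"
   # into first_name and last_name (field names taken from the config).
   def split_raw_message(row, separator="#", fields=("first_name", "last_name")):
       parts = row.get("raw_message", "").split(separator)
       # Pad with empty strings when the separator is missing, so a record
       # like "bai,gj" (comma instead of "#") yields an empty last_name.
       parts += [""] * (len(fields) - len(parts))
       return {**row, **dict(zip(fields, parts))}
   
   record = {"id": "10", "name": "test_admin", "sex": "man",
             "age": "26", "raw_message": "sun#wuk"}
   print(split_raw_message(record))
   ```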
   
   ### SeaTunnel Version
   
   seatunnel 2.3.0
   
   ### SeaTunnel Config
   
   ```conf
   env {
     job.name = "seatunnel_split_8dlUQM7J"
     
     job.mode = "STREAMING"
   }
   
   source {
     Kafka {
       bootstrap.servers = ""
       
       commit_on_checkpoint = "true"
       
       format = "json"
       
       pattern = "false"
       
       schema = {
         fields {
           id = "int"
           
           name = "string"
           
           sex = "string"
           
           age = "string"
           
           raw_message = "string"
         }
       }
       
       start_mode = "earliest"
       
       topic = "demo02"
     }
   }
   
   transform {
     split {
       fields = ["first_name", "last_name"]
       
       separator = "#"
     }
   }
   
   sink {
     Jdbc {
       batch_interval_ms = 1000
       
       batch_size = 300
       
       connection_check_timeout_sec = 30
       
       driver = "com.mysql.cj.jdbc.Driver"
       
       is_exactly_once = "false"
       
       max_commit_attempts = 3
       
       max_retries = 3
       
       password = ""
       
       support_upsert_by_query_primary_key_exist = "false"
       
       table = "xxxx"
       
       transaction_timeout_sec = -1
       
       url = ""
       
       user = ""
     }
     
     Console {
       
     }
   }
   ```
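   As a quick sanity check on the sample data (plain Python, independent of SeaTunnel), only records whose raw_message actually contains the configured # separator can yield a non-empty last_name:
   
   ```python
   # Check which sample raw_message values contain the configured "#" separator.
   samples = ["sun#wuk", "tang#sz", "zhu#bj", "bai,gj"]
   for raw in samples:
       first, sep, last = raw.partition("#")
       print(f"{raw} -> first_name={first!r}, last_name={last!r}")
   # str.partition leaves the last part empty when "#" is absent, so the
   # "bai,gj" record can never split -- but the first three should, which
   # means the all-empty output reported above still looks like a bug.
   ```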
   
   
   ### Running Command
   
   ```shell
   /opt/seatunnel/seatunnel/bin/start-seatunnel-flink-connector-v2.sh  -D 
env.java.opts="-Dfile.encoding=UTF-8 -Dsun.jnu.encoding=UTF-8" --config xxx.conf
   ```
   
   
   ### Error Exception
   
   ```log
   No exception is thrown,
   but the field values produced by the split are empty.
   ```
   
   
   ### Flink or Spark Version
   
   flink 1.13.6 
   
   ### Java or Scala Version
   
   java 11
   
   ### Screenshots
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://www.apache.org/foundation/policies/conduct)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
