stayrascal opened a new issue #4219:
URL: https://github.com/apache/hudi/issues/4219


   **_Tips before filing an issue_**
   
   - Have you gone through our 
[FAQs](https://cwiki.apache.org/confluence/display/HUDI/FAQ)?
      No
   - Join the mailing list to engage in conversations and get faster support at 
[email protected].
   
   - If you have triaged this as a bug, then file an 
[issue](https://issues.apache.org/jira/projects/HUDI/issues) directly.
   
   **Describe the problem you faced**
   
   I'm spiking capturing data changes into a Hudi table via Flink CDC. I rebuilt Hudi after changing the Flink version from 1.13.1 to 1.13.3, and I hit a NoSuchMethodError when updating a field in MySQL that is used as the partition key of the Hudi table.
   
   It might be a shaded-package issue, or a Kryo issue when setting the instantiator strategy, since org.objenesis.strategy.InstantiatorStrategy is not found in the package:
   
![image](https://user-images.githubusercontent.com/13775334/144715253-d02dc963-4fc6-44aa-a69f-88b3995df993.png)
   
![image](https://user-images.githubusercontent.com/13775334/144715346-326c7ce2-dd7b-41d8-8eb4-0415ccea69c2.png)
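
   For illustration, here is a minimal standalone sketch of the call that a shading/relocation mismatch of this kind typically breaks. This is not code from Hudi or Flink; it only assumes plain `kryo` and `objenesis` jars on the classpath.

   ```java
   import com.esotericsoftware.kryo.Kryo;
   import org.objenesis.strategy.StdInstantiatorStrategy;

   public class KryoInstantiatorSketch {
       public static void main(String[] args) {
           Kryo kryo = new Kryo();
           // Kryo#setInstantiatorStrategy is declared against
           // org.objenesis.strategy.InstantiatorStrategy. If a bundle relocates
           // objenesis (or Kryo itself) to a shaded package while a caller was
           // compiled against the original signature, the JVM cannot resolve the
           // method at runtime and throws NoSuchMethodError.
           kryo.setInstantiatorStrategy(new StdInstantiatorStrategy());
           System.out.println("Instantiator strategy configured.");
       }
   }
   ```

   Listing the contents of the built hudi-flink-bundle jar (e.g. with `jar tf`) should show whether `org/objenesis/strategy/InstantiatorStrategy.class` ended up under its original path or a relocated one.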
   
   
   Hudi version: master branch at commit bed7f9897a9127130c3d241df7634d44aa12167b
   Flink version: 1.13.3
   
   **To Reproduce**
   
   Steps to reproduce the behavior:
   
   1. Rebuild Hudi after changing the Flink version from 1.13.1 to 1.13.3.
   2. Follow this documentation: https://ververica.github.io/flink-cdc-connectors/master/content/quickstart/mysql-postgres-tutorial.html (a hypothetical sketch of the resulting CDC source table and pipeline follows this list).
   3. Create the target Hudi table via Flink SQL:
   ```
   CREATE TABLE orders_hudi (
     order_id INT,
     order_date TIMESTAMP(0),
     customer_name STRING,
     price DECIMAL(10, 5),
     product_id INT,
     order_status BOOLEAN,
     PRIMARY KEY (order_id) NOT ENFORCED
   )
   PARTITIONED BY (product_id)
   WITH (
     'connector' = 'hudi',
     'path' = 'file://xxxxx/flink-1.13.3/data/hudi/orders_hudi_new',
     'table.type' = 'MERGE_ON_READ',
     'read.streaming.enabled' = 'true',
     'read.streaming.start-commit' = '19700101000005',
     'read.streaming.check-interval' = '4'
   );
   ```
   4. Try to update the `product_id` of an existing order in MySQL (sketched below).
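
   A hypothetical end-to-end sketch of the pipeline from steps 2-4, written against the Flink 1.13 Table API. The MySQL connection settings, database/table names, and the example UPDATE statement are assumptions based on the linked Flink CDC tutorial, not values from this report; it also assumes the flink-connector-mysql-cdc and hudi-flink-bundle jars are on the classpath.

   ```java
   import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
   import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

   public class OrdersCdcToHudiSketch {
       public static void main(String[] args) throws Exception {
           StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
           StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env);

           // MySQL CDC source mirroring the tutorial's `orders` table
           // (hostname, credentials, and names below are placeholders).
           tableEnv.executeSql(
               "CREATE TABLE orders_source ("
                   + "  order_id INT,"
                   + "  order_date TIMESTAMP(0),"
                   + "  customer_name STRING,"
                   + "  price DECIMAL(10, 5),"
                   + "  product_id INT,"
                   + "  order_status BOOLEAN,"
                   + "  PRIMARY KEY (order_id) NOT ENFORCED"
                   + ") WITH ("
                   + "  'connector' = 'mysql-cdc',"
                   + "  'hostname' = 'localhost',"
                   + "  'port' = '3306',"
                   + "  'username' = 'root',"
                   + "  'password' = '123456',"
                   + "  'database-name' = 'mydb',"
                   + "  'table-name' = 'orders'"
                   + ")");

           // Assumes the orders_hudi DDL from step 3 has also been executed in this
           // TableEnvironment; the continuous insert below streams change events into Hudi.
           tableEnv.executeSql("INSERT INTO orders_hudi SELECT * FROM orders_source").await();

           // On the MySQL side, updating the partition-key column, e.g.
           //   UPDATE mydb.orders SET product_id = 102 WHERE order_id = 10001;
           // then produces the change event that triggers the NoSuchMethodError.
       }
   }
   ```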
   
   **Expected behavior**
   No exception should happen and the update should be synced to the Hudi table; even if some events fail, they should not block subsequent database events.
   
   
   **Environment Description**
   Local
   * Hudi version :
   0.11.0 (master branch, commit bed7f9897a9127130c3d241df7634d44aa12167b)
   * Spark version :
   
   * Hive version :
   
   * Hadoop version :
   
   * Storage (HDFS/S3/GCS..) :
   Local
   * Running on Docker? (yes/no) :
   no
   
   **Additional context**
   
   
   **Stacktrace**
   
   The NoSuchMethodError stacktrace is shown in the screenshots above.
   
   

