conorzhao opened a new issue, #11833: URL: https://github.com/apache/hudi/issues/11833
**_Tips before filing an issue_**

- Have you gone through our [FAQs](https://hudi.apache.org/learn/faq/)?
- Join the mailing list to engage in conversations and get faster support at [email protected].
- If you have triaged this as a bug, then file an [issue](https://issues.apache.org/jira/projects/HUDI/issues) directly.

**Describe the problem you faced**

Flink CDC synchronizes MySQL data into a Hudi table. After new columns are added in MySQL and synchronized to Hudi, queries against the table occasionally fail with an exception. The synchronization job keeps writing normally, metadata synchronization works, and the newly added columns are visible when querying the table schema.

**To Reproduce**

Steps to reproduce the behavior:

1.
2.
3.
4.

**Expected behavior**

A clear and concise description of what you expected to happen.

**Environment Description**

* Hudi version : 0.14.1
* Spark version : 3.3.1
* Hive version : 3.1.0
* Hadoop version : 3.1.1
* Storage (HDFS/S3/GCS..) : HDFS
* Running on Docker? (yes/no) : 🔕

**Additional context**

Add any other context about the problem here.
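The `Caused by` in the stacktrace below is standard Avro schema resolution: a field that exists in the reader schema but not in the writer schema can only be filled in if it carries a default value, so a column added as a required field (non-nullable, no default) makes older log blocks undecodable. Here is a minimal, self-contained sketch of that rule; the record and field names are only borrowed from the stacktrace for illustration and are not the reporter's actual schemas:

```java
import org.apache.avro.AvroTypeException;
import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.EncoderFactory;

import java.io.ByteArrayOutputStream;

public class AvroEvolutionRepro {
    public static void main(String[] args) throws Exception {
        // Writer schema: what old log blocks were serialized with,
        // before the new column existed.
        Schema writerSchema = SchemaBuilder.record("apply_record").fields()
                .requiredString("id")
                .endRecord();

        // Encode one record with the old schema.
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        GenericRecord oldRecord = new GenericData.Record(writerSchema);
        oldRecord.put("id", "1");
        new GenericDatumWriter<GenericRecord>(writerSchema).write(oldRecord, encoder);
        encoder.flush();
        byte[] oldBytes = out.toByteArray();

        // Reader schema A: new column added as a required field, no default.
        // Avro cannot resolve the old bytes against it.
        Schema requiredReader = SchemaBuilder.record("apply_record").fields()
                .requiredString("id")
                .requiredString("is_plus_cust_flag") // field name taken from the stacktrace
                .endRecord();

        BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(oldBytes, null);
        try {
            new GenericDatumReader<GenericRecord>(writerSchema, requiredReader)
                    .read(null, decoder);
        } catch (AvroTypeException e) {
            // Fails with "missing required field is_plus_cust_flag", as in the report.
            System.out.println("required field fails: " + e.getMessage());
        }

        // Reader schema B: new column added as nullable with default null.
        // Old blocks resolve cleanly; the absent field comes back as null.
        Schema optionalReader = SchemaBuilder.record("apply_record").fields()
                .requiredString("id")
                .optionalString("is_plus_cust_flag")
                .endRecord();

        decoder = DecoderFactory.get().binaryDecoder(oldBytes, null);
        GenericRecord resolved = new GenericDatumReader<GenericRecord>(writerSchema, optionalReader)
                .read(null, decoder);
        System.out.println("optional field resolves: " + resolved);
    }
}
```

If the CDC pipeline registers new columns as required Avro fields, a reader resolving pre-evolution log blocks against the new table schema would fail exactly this way, which could also explain why the failures are intermittent: only file groups that still contain old log blocks would be affected.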
**Stacktrace**

```
24/08/27 15:12:23 WARN TaskSetManager: Lost task 4.2 in stage 5.0 (TID 41) (172.41.4.87 executor 9): org.apache.hudi.exception.HoodieException: Exception when reading log file
    at org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.scanInternalV1(AbstractHoodieLogRecordReader.java:414)
    at org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.scanInternal(AbstractHoodieLogRecordReader.java:220)
    at org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.performScan(HoodieMergedLogRecordScanner.java:201)
    at org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.<init>(HoodieMergedLogRecordScanner.java:117)
    at org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.<init>(HoodieMergedLogRecordScanner.java:76)
    at org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner$Builder.build(HoodieMergedLogRecordScanner.java:466)
    at org.apache.hudi.LogFileIterator$.scanLog(Iterators.scala:413)
    at org.apache.hudi.LogFileIterator.<init>(Iterators.scala:110)
    at org.apache.hudi.RecordMergingFileIterator.<init>(Iterators.scala:234)
    at org.apache.hudi.RecordMergingFileIterator.<init>(Iterators.scala:245)
    at org.apache.hudi.RecordMergingFileIterator.<init>(Iterators.scala:250)
    at org.apache.hudi.HoodieMergeOnReadRDD.compute(HoodieMergeOnReadRDD.scala:109)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:365)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:329)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:365)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:329)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:365)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:329)
    at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:52)
    at org.apache.spark.scheduler.Task.run(Task.scala:136)
    at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:548)
    at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1504)
Caused by: org.apache.avro.AvroTypeException: Found hd_db_kelp_ap_apply_t record, expecting hd_db_kelp_ap_apply_test_record, missing required field is_plus_cust_flag
    at org.apache.avro.io.ResolvingDecoder.doAction(ResolvingDecoder.java:308)
    at org.apache.avro.io.parsing.Parser.advance(Parser.java:86)
    at org.apache.avro.io.ResolvingDecoder.readFieldOrder(ResolvingDecoder.java:127)
    at org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:240)
    at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:180)
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:161)
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:154)
    at org.apache.hudi.common.table.log.block.HoodieAvroDataBlock$RecordIterator.next(HoodieAvroDataBlock.java:204)
    at org.apache.hudi.common.table.log.block.HoodieAvroDataBlock$RecordIterator.next(HoodieAvroDataBlock.java:149)
    at org.apache.hudi.common.util.collection.MappingIterator.next(MappingIterator.java:44)
    at org.apache.hudi.common.util.collection.MappingIterator.next(MappingIterator.java:44)
    at org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.processDataBlock(AbstractHoodieLogRecordReader.java:784)
    at org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.processQueuedBlocksForInstant(AbstractHoodieLogRecordReader.java:825)
    at org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.scanInternalV1(AbstractHoodieLogRecordReader.java:403)
    ... 29 more
```
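For anyone triaging this: one avenue worth trying (an assumption on my part, not a confirmed fix for this exact failure) is Hudi's schema-on-read path, toggled by `hoodie.schema.on.read.enable`, which resolves each base file and log block against the evolved table schema instead of one fixed Avro schema. A sketch of a Spark read with that option, where `/path/to/hudi_table` is a placeholder:

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class ReadWithSchemaOnRead {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("hudi-schema-on-read")
                .getOrCreate();

        // Ask Hudi to apply its schema-evolution-aware read path
        // when merging base files and log blocks.
        Dataset<Row> df = spark.read()
                .format("hudi")
                .option("hoodie.schema.on.read.enable", "true")
                .load("/path/to/hudi_table");

        df.printSchema();
    }
}
```

Independently of the read path, ensuring that evolved columns land in the table schema as nullable fields with defaults (as in the Avro sketch above) keeps older log blocks readable.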
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]