HuangFru commented on issue #7325:
URL: https://github.com/apache/hudi/issues/7325#issuecomment-1333251501

   I think I've solved this issue. When I create tables with Flink, I add this to the conf in the Hudi Flink writer:
   ```
   conf.setString(FlinkOptions.PRECOMBINE_FIELD, FlinkOptions.NO_PRE_COMBINE);
   ```
   But I didn't add this conf when creating tables through the Hudi catalog. After adding it there as well, Spark can read the tables correctly.
   I'm still confused about this. If I don't set this conf to NO_PRE_COMBINE, Flink creates the table with the pre-combine field 'ts', and then Spark hits this error when reading the table. I also tried reading with Presto, and it works fine there. It seems that only Spark hits this error.
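   For reference, the catalog-side equivalent of that conf is setting the `precombine.field` table option in the CREATE TABLE DDL; `'no_precombine'` is the value behind `FlinkOptions.NO_PRE_COMBINE`. This is just a sketch — the table name and schema below are made up for illustration:
   ```
   -- Hypothetical table; the key point is the 'precombine.field' option,
   -- which corresponds to FlinkOptions.PRECOMBINE_FIELD.
   CREATE TABLE hudi_catalog.db.demo_table (
     id BIGINT,
     name STRING,
     ts TIMESTAMP(3),
     PRIMARY KEY (id) NOT ENFORCED
   ) WITH (
     'connector' = 'hudi',
     'precombine.field' = 'no_precombine'  -- FlinkOptions.NO_PRE_COMBINE
   );
   ```
   Without that option, the writer defaults the pre-combine field to 'ts', which is what triggered the Spark read error in my case.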
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
