voonhous commented on code in PR #6915:
URL: https://github.com/apache/hudi/pull/6915#discussion_r1037771871
##########
hudi-flink-datasource/hudi-flink/src/main/java/org/apache/hudi/table/HoodieTableFactory.java:
##########
@@ -120,6 +122,7 @@ public Set<ConfigOption<?>> optionalOptions() {
*/
private void sanityCheck(Configuration conf, ResolvedSchema schema) {
List<String> fields = schema.getColumnNames();
+ Schema inferredSchema =
AvroSchemaConverter.convertToSchema(schema.toPhysicalRowDataType().notNull().getLogicalType());
Review Comment:
Sure, I will take a look.
The main reasons for doing this are:
1. `AvroSchemaConverter.convertToSchema` was already imported and used
elsewhere in the code, so it was simply reused.
2. Converting to an Avro `Schema` allows the helper functions to live in
`HoodieAvroUtils`, so the validation performed when creating tables via the
Spark entrypoint can be reused.
Let me see whether we can use the `ResolvedSchema` directly instead; I will
get back to you.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]