dawidwys commented on a change in pull request #13373:
URL: https://github.com/apache/flink/pull/13373#discussion_r487208716
##########
File path: flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/typeutils/AvroSchemaConverter.java
##########
@@ -272,11 +276,14 @@ private static DataType convertToDataType(Schema schema) {
return DataTypes.TIMESTAMP(3)
.bridgedTo(java.sql.Timestamp.class)
.notNull();
- }
- if (schema.getLogicalType() == LogicalTypes.timestampMicros()) {
+ } else if (schema.getLogicalType() == LogicalTypes.timestampMicros()) {
return DataTypes.TIMESTAMP(6)
.bridgedTo(java.sql.Timestamp.class)
.notNull();
+ } else if (schema.getLogicalType() == LogicalTypes.timeMicros()) {
+ return DataTypes.TIME(6)
+ .bridgedTo(LocalTime.class)
Review comment:
Actually, those should be the default conversion classes from `java.time`; I changed that. The method is not used yet, but we might need it later, e.g. in the schema-registry format, where we will need to convert a schema retrieved from the schema registry to a SQL schema.
It was added as the complementary method to
`org.apache.flink.formats.avro.typeutils.AvroSchemaConverter#convertToSchema(org.apache.flink.table.types.logical.LogicalType)`.
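For context, here is a minimal, self-contained sketch (not the actual Flink code path) of the mapping this branch adds: an Avro `time-micros` logical type becomes `TIME(6)` bridged to `java.time.LocalTime`, the default conversion class from `java.time`. The class name `TimeMicrosMappingSketch` and the standalone `main` are purely illustrative.

```java
import java.time.LocalTime;

import org.apache.avro.LogicalTypes;
import org.apache.avro.Schema;

import org.apache.flink.table.api.DataTypes;
import org.apache.flink.table.types.DataType;

public class TimeMicrosMappingSketch {

	public static void main(String[] args) {
		// An Avro LONG schema carrying the time-micros logical type.
		Schema avroType = LogicalTypes.timeMicros()
			.addToSchema(Schema.create(Schema.Type.LONG));

		// Mirrors the new branch in AvroSchemaConverter#convertToDataType:
		// time-micros maps to TIME(6) with java.time.LocalTime as conversion class.
		final DataType dataType;
		if (avroType.getLogicalType() == LogicalTypes.timeMicros()) {
			dataType = DataTypes.TIME(6)
				.bridgedTo(LocalTime.class)
				.notNull();
		} else {
			throw new IllegalStateException(
				"Unexpected logical type: " + avroType.getLogicalType());
		}

		System.out.println(dataType); // TIME(6) NOT NULL
	}
}
```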
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]