[
https://issues.apache.org/jira/browse/FLINK-17091?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17105302#comment-17105302
]
Paul Lin commented on FLINK-17091:
----------------------------------
[~lzljs3620320] Thanks a lot for the information!
[~dwysakowicz] Thanks a lot for the very detailed explanation! Now I
understand your point. I think with the new Blink planner TIMESTAMP will always
be bridged to LocalDateTime (correct me if I'm wrong), and we keep
java.sql.Timestamp only for compatibility with the old planner. Once we have
fully migrated to the new planner, we can safely clean up the
java.sql.Timestamp code.
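For reference, a minimal sketch of the bridging I mean, using the Table API's DataTypes (the class name TimestampBridging is only for illustration):
{code:java}
import org.apache.flink.table.api.DataTypes;
import org.apache.flink.table.types.DataType;

public class TimestampBridging {
    public static void main(String[] args) {
        // With the new type system, TIMESTAMP(3) defaults to java.time.LocalDateTime.
        DataType newDefault = DataTypes.TIMESTAMP(3);

        // The legacy java.sql.Timestamp representation has to be requested explicitly,
        // which is what keeps it around for old-planner compatibility.
        DataType legacy = DataTypes.TIMESTAMP(3).bridgedTo(java.sql.Timestamp.class);

        System.out.println(newDefault.getConversionClass()); // class java.time.LocalDateTime
        System.out.println(legacy.getConversionClass());     // class java.sql.Timestamp
    }
}
{code}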
> AvroRow(De)SerializationSchema doesn't support new Timestamp conversion
> classes
> -------------------------------------------------------------------------------
>
> Key: FLINK-17091
> URL: https://issues.apache.org/jira/browse/FLINK-17091
> Project: Flink
> Issue Type: Bug
> Components: Formats (JSON, Avro, Parquet, ORC, SequenceFile)
> Affects Versions: 1.10.0
> Reporter: Paul Lin
> Priority: Major
>
> AvroRow(De)SerializationSchema doesn't know how to convert the new Timestamp
> conversion classes (e.g. java.time.LocalDateTime) to/from Avro's int/long-based
> timestamp representations. Currently, when encountering objects of the new
> conversion classes, AvroRow(De)SerializationSchema passes them through
> unconverted to Avro's GenericDatumWriter/Reader, which then throws a
> ClassCastException. See
> [AvroRowSerializationSchema|https://github.com/apache/flink/blob/master/flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/AvroRowSerializationSchema.java#L251].
> To fix this problem, we should support converting LocalTime/LocalDate/LocalDateTime
> to int/long in AvroRowSerializationSchema, and converting int/long back to
> LocalTime/LocalDate/LocalDateTime based on the logical types
> (Types.LOCAL_TIME/Types.LOCAL_DATE/Types.LOCAL_DATE_TIME) in
> AvroRowDeserializationSchema.
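A rough sketch of the kind of conversions the fix would need, mapping to Avro's date/time-millis/timestamp-millis logical types (the helper class AvroTimeConversions below is hypothetical, not the actual schema code):
{code:java}
import java.time.LocalDate;
import java.time.LocalDateTime;
import java.time.LocalTime;
import java.time.ZoneOffset;

// Hypothetical helpers only; the real fix would live inside
// AvroRowSerializationSchema / AvroRowDeserializationSchema.
public class AvroTimeConversions {

    // Avro 'date' logical type: days since epoch as int.
    static int toAvroDate(LocalDate date) {
        return (int) date.toEpochDay();
    }

    static LocalDate fromAvroDate(int epochDays) {
        return LocalDate.ofEpochDay(epochDays);
    }

    // Avro 'time-millis' logical type: milliseconds of day as int.
    static int toAvroTimeMillis(LocalTime time) {
        return (int) (time.toNanoOfDay() / 1_000_000L);
    }

    static LocalTime fromAvroTimeMillis(int millisOfDay) {
        return LocalTime.ofNanoOfDay(millisOfDay * 1_000_000L);
    }

    // Avro 'timestamp-millis' logical type: milliseconds since epoch as long (UTC).
    static long toAvroTimestampMillis(LocalDateTime ts) {
        return ts.toInstant(ZoneOffset.UTC).toEpochMilli();
    }

    static LocalDateTime fromAvroTimestampMillis(long epochMillis) {
        return LocalDateTime.ofEpochSecond(
                Math.floorDiv(epochMillis, 1000L),
                (int) Math.floorMod(epochMillis, 1000L) * 1_000_000,
                ZoneOffset.UTC);
    }
}
{code}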
--
This message was sent by Atlassian Jira
(v8.3.4#803005)