[ https://issues.apache.org/jira/browse/FLINK-33507 ]


    jinzhuguang deleted comment on FLINK-33507:
    -------------------------------------

was (Author: JIRAUSER302532):
Thank you for your reply. In my scenario I only want zero dates to be read as 
NULL, but json.ignore-parse-errors swallows every exception, which can hide 
real problems in production. I would therefore like Flink to provide an option 
similar to the MySQL JDBC driver's zeroDateTimeBehavior parameter:
{code:java}
URL = "jdbc:mysql://****:3306/****?zeroDateTimeBehavior=convertToNull";{code}
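For illustration, here is a minimal sketch of the convert-to-null behavior I have in mind, written against plain java.time. The class and helper name (ZeroTimestampToNull, parseOrNull), the formatter pattern, and the check for the literal "0000-00-00" prefix are my own assumptions for this sketch, not an existing Flink option:
{code:java}
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class ZeroTimestampToNull {

    private static final DateTimeFormatter FMT =
            DateTimeFormatter.ofPattern("uuuu-MM-dd HH:mm:ss");

    // Hypothetical helper mimicking zeroDateTimeBehavior=convertToNull:
    // MySQL "zero" timestamps become NULL, everything else is parsed normally.
    static LocalDateTime parseOrNull(String text) {
        if (text == null || text.startsWith("0000-00-00")) {
            return null;
        }
        return LocalDateTime.parse(text, FMT);
    }

    public static void main(String[] args) {
        System.out.println(parseOrNull("0000-00-00 00:00:00")); // null
        System.out.println(parseOrNull("2023-11-09 12:34:56")); // 2023-11-09T12:34:56
    }
}
{code}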

> JsonToRowDataConverters can't parse zero timestamp  '0000-00-00 00:00:00'
> -------------------------------------------------------------------------
>
>                 Key: FLINK-33507
>                 URL: https://issues.apache.org/jira/browse/FLINK-33507
>             Project: Flink
>          Issue Type: Bug
>          Components: Formats (JSON, Avro, Parquet, ORC, SequenceFile)
>    Affects Versions: 1.16.0
>         Environment: Flink 1.16.0
>            Reporter: jinzhuguang
>            Priority: Major
>              Labels: CDC, JsonFormatter, Kafka, MySQL
>   Original Estimate: 96h
>  Remaining Estimate: 96h
>
> When I use Flink CDC to synchronize data from MySQL, the change records are 
> stored in Kafka in JSON format. When Flink reads the data back from Kafka, 
> TIMESTAMP values of "0000-00-00 00:00:00" from MySQL cannot be parsed, and 
> the following error is reported:
> Caused by: 
> org.apache.flink.formats.json.JsonToRowDataConverters$JsonParseException: 
> Fail to deserialize at field: data.
>     at 
> org.apache.flink.formats.json.JsonToRowDataConverters.lambda$createRowConverter$ef66fe9a$1(JsonToRowDataConverters.java:354)
>     at 
> org.apache.flink.formats.json.JsonToRowDataConverters.lambda$wrapIntoNullableConverter$de0b9253$1(JsonToRowDataConverters.java:380)
>     at 
> org.apache.flink.formats.json.JsonRowDataDeserializationSchema.convertToRowData(JsonRowDataDeserializationSchema.java:131)
>     at 
> org.apache.flink.formats.json.canal.CanalJsonDeserializationSchema.deserialize(CanalJsonDeserializationSchema.java:234)
>     ... 17 more
> Caused by: 
> org.apache.flink.formats.json.JsonToRowDataConverters$JsonParseException: 
> Fail to deserialize at field: update_time.
>     at 
> org.apache.flink.formats.json.JsonToRowDataConverters.lambda$createRowConverter$ef66fe9a$1(JsonToRowDataConverters.java:354)
>     at 
> org.apache.flink.formats.json.JsonToRowDataConverters.lambda$wrapIntoNullableConverter$de0b9253$1(JsonToRowDataConverters.java:380)
>     at 
> org.apache.flink.formats.json.JsonToRowDataConverters.lambda$createArrayConverter$94141d67$1(JsonToRowDataConverters.java:304)
>     at 
> org.apache.flink.formats.json.JsonToRowDataConverters.lambda$wrapIntoNullableConverter$de0b9253$1(JsonToRowDataConverters.java:380)
>     at 
> org.apache.flink.formats.json.JsonToRowDataConverters.convertField(JsonToRowDataConverters.java:370)
>     at 
> org.apache.flink.formats.json.JsonToRowDataConverters.lambda$createRowConverter$ef66fe9a$1(JsonToRowDataConverters.java:350)
>     ... 20 more
> Caused by: java.time.format.DateTimeParseException: Text '0000-00-00 
> 00:00:00' could not be parsed: Invalid value for MonthOfYear (valid values 1 
> - 12): 0
>     at 
> java.time.format.DateTimeFormatter.createError(DateTimeFormatter.java:1920)
>     at java.time.format.DateTimeFormatter.parse(DateTimeFormatter.java:1781)
>     at 
> org.apache.flink.formats.json.JsonToRowDataConverters.convertToTimestamp(JsonToRowDataConverters.java:224)
>     at 
> org.apache.flink.formats.json.JsonToRowDataConverters.lambda$wrapIntoNullableConverter$de0b9253$1(JsonToRowDataConverters.java:380)
>     at 
> org.apache.flink.formats.json.JsonToRowDataConverters.convertField(JsonToRowDataConverters.java:370)
>     at 
> org.apache.flink.formats.json.JsonToRowDataConverters.lambda$createRowConverter$ef66fe9a$1(JsonToRowDataConverters.java:350)
>     ... 25 more
> Caused by: java.time.DateTimeException: Invalid value for MonthOfYear (valid 
> values 1 - 12): 0
>     at java.time.temporal.ValueRange.checkValidIntValue(ValueRange.java:330)
>     at java.time.temporal.ChronoField.checkValidIntValue(ChronoField.java:722)
>     at java.time.chrono.IsoChronology.resolveYMD(IsoChronology.java:550)
>     at java.time.chrono.IsoChronology.resolveYMD(IsoChronology.java:123)
>     at 
> java.time.chrono.AbstractChronology.resolveDate(AbstractChronology.java:472)
>     at java.time.chrono.IsoChronology.resolveDate(IsoChronology.java:492)
>     at java.time.chrono.IsoChronology.resolveDate(IsoChronology.java:123)
>     at java.time.format.Parsed.resolveDateFields(Parsed.java:351)
>     at java.time.format.Parsed.resolveFields(Parsed.java:257)
>     at java.time.format.Parsed.resolve(Parsed.java:244)
>     at 
> java.time.format.DateTimeParseContext.toResolved(DateTimeParseContext.java:331)
>     at 
> java.time.format.DateTimeFormatter.parseResolved0(DateTimeFormatter.java:1955)
>     at java.time.format.DateTimeFormatter.parse(DateTimeFormatter.java:1777)
>     ... 29 more
> The MySQL server and its client drivers can be configured to treat such zero 
> timestamps as NULL, so I think Flink should support this as well.
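> For reference, the root cause can be reproduced outside Flink with plain 
> java.time, the same API that JsonToRowDataConverters.convertToTimestamp relies 
> on. This is only a minimal reproduction of the failure, not Flink code, and 
> the formatter pattern used here is an assumption:
> {code:java}
> import java.time.LocalDateTime;
> import java.time.format.DateTimeFormatter;
> import java.time.format.DateTimeParseException;
>
> public class ZeroTimestampRepro {
>     public static void main(String[] args) {
>         DateTimeFormatter fmt = DateTimeFormatter.ofPattern("uuuu-MM-dd HH:mm:ss");
>         try {
>             LocalDateTime.parse("0000-00-00 00:00:00", fmt);
>         } catch (DateTimeParseException e) {
>             // Prints: Text '0000-00-00 00:00:00' could not be parsed:
>             // Invalid value for MonthOfYear (valid values 1 - 12): 0
>             System.out.println(e.getMessage());
>         }
>     }
> }
> {code}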



--
This message was sent by Atlassian Jira
(v8.20.10#820010)