[https://issues.apache.org/jira/browse/FLINK-18414?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17142638#comment-17142638]
Jark Wu commented on FLINK-18414:
---------------------------------
We added a new option {{json.ignore-parse-errors}} in 1.11; is that what you
are looking for?
https://ci.apache.org/projects/flink/flink-docs-master/dev/table/connectors/formats/json.html#json-ignore-parse-errors
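For reference, the option is set in the table DDL like this (table name,
topic, schema, and broker address below are illustrative, not from the
issue):

```sql
CREATE TABLE kafka_source (
  a DOUBLE,
  b DOUBLE
) WITH (
  'connector' = 'kafka',
  'topic' = 'my_topic',
  'properties.bootstrap.servers' = 'localhost:9092',
  'format' = 'json',
  -- skip records that fail JSON parsing instead of failing the job
  'json.ignore-parse-errors' = 'true'
);
```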
> Kafka Json connector in Table API support more option
> -----------------------------------------------------
>
> Key: FLINK-18414
> URL: https://issues.apache.org/jira/browse/FLINK-18414
> Project: Flink
> Issue Type: Improvement
> Components: Connectors / Kafka, Formats (JSON, Avro, Parquet, ORC,
> SequenceFile), Table SQL / Ecosystem
> Affects Versions: 1.10.1
> Reporter: DuBin
> Priority: Major
>
> Currently, Flink uses
> 'org.apache.flink.formats.json.JsonRowDeserializationSchema' to deserialize
> records into a Row when a Kafka JSON table source is defined.
> But the parser is hard-coded in the class:
> private final ObjectMapper objectMapper = new ObjectMapper();
> Imagine that the JSON data source contains records like this:
> {"a":NaN,"b":1.2}
> or contains some dirty data: the deserialize function will throw an
> exception every time, because Kafka does not validate JSON records against
> a schema.
>
> So can we add more options to
> 'org.apache.flink.formats.json.JsonRowFormatFactory', specifically in
> 'org.apache.flink.formats.json.JsonRowFormatFactory#createDeserializationSchema'?
> E.g. more options for the ObjectMapper, or a user-defined dirty-data
> handler (which could, for example, return an empty row).
>
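The dirty-data handler the reporter suggests can be sketched in a few lines;
this is a minimal, language-agnostic illustration in Python (not Flink's
actual API; the function name is hypothetical). Note that a lenient parser
accepts the NaN literal from the example record, and any record that fails
to parse yields an empty row instead of raising:

```python
import json
import math


def lenient_deserialize(raw):
    """Parse one JSON record leniently.

    Python's json module already accepts non-standard literals such as
    NaN and Infinity, so {"a":NaN,"b":1.2} parses. On any parse error
    (dirty data), return an empty dict -- the "empty row" behaviour the
    reporter proposes -- rather than propagating the exception.
    """
    try:
        return json.loads(raw)
    except (json.JSONDecodeError, TypeError):
        return {}
```

Flink's actual 'json.ignore-parse-errors' option behaves similarly: parse
failures produce null/skipped rows instead of failing the job.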
--
This message was sent by Atlassian Jira
(v8.3.4#803005)