[ https://issues.apache.org/jira/browse/FLINK-20234?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jark Wu updated FLINK-20234:
----------------------------
    Component/s: Table SQL / Ecosystem

> Json format supports SE/DE null elements of ARRAY type field
> ------------------------------------------------------------
>
>                 Key: FLINK-20234
>                 URL: https://issues.apache.org/jira/browse/FLINK-20234
>             Project: Flink
>          Issue Type: Improvement
>          Components: Formats (JSON, Avro, Parquet, ORC, SequenceFile), Table 
> SQL / Ecosystem
>    Affects Versions: 1.11.2
>            Reporter: Danny Chen
>            Priority: Major
>             Fix For: 1.12.0
>
>
> Reported on the USER mailing list:
> Hi,
> I recently discovered that some of our data has NULL values arriving in an 
> ARRAY<STRING> column. This column is being consumed by Flink via the Kafka 
> connector with the Debezium format. We receive NullPointerExceptions when 
> these NULL values in the arrays arrive, which restarts the source operator 
> in a loop.
> Is there any way to avoid the exception, or to filter out NULLs in an ARRAY 
> of STRINGs in Flink?
> We're somewhat stuck on how to solve this problem; we'd like to be defensive 
> about this on Flink's side.
> Thanks!
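> For illustration, a minimal sketch of the setup described above (the table, 
> topic, and column names are hypothetical, not taken from the report):
>
>     -- hypothetical table; 'debezium-json' is the format from the report
>     CREATE TABLE orders (
>       id BIGINT,
>       tags ARRAY<STRING>
>     ) WITH (
>       'connector' = 'kafka',
>       'topic' = 'orders',
>       'properties.bootstrap.servers' = 'localhost:9092',
>       'format' = 'debezium-json'
>     );
>
> A Debezium record such as
>
>     {"before": null, "after": {"id": 1, "tags": ["a", null, "b"]}, "op": "c"}
>
> would then carry a null element inside the ARRAY<STRING> column, which is 
> the case the JSON format currently fails to deserialize.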



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
