Re: Re: Error handler strategy in Flink Kafka connector with json format
yep. Glad to see the progress.

Best

At 2020-03-09 12:44:05, "Jingsong Li" wrote:
> Hi Sunfulin, I think this is very important too. There is an issue to
> fix this [1]. Does that meet your requirement?
>
> [1] https://issues.apache.org/jira/browse/FLINK-15396
>
> Best, Jingsong Lee
Re: Error handler strategy in Flink Kafka connector with json format
Hi Sunfulin,

I think this is very important too. There is an issue to fix this [1]. Does that meet your requirement?

[1] https://issues.apache.org/jira/browse/FLINK-15396

Best,
Jingsong Lee

On Mon, Mar 9, 2020 at 12:33 PM sunfulin wrote:
> hi, community,
> I am wondering if there are config params for an error handling strategy,
> as [1] describes, when defining a Kafka stream table using Flink SQL DDL.
> For example, the following 'json.parser.failure.strategy' could be set to
> 'silently skip' so that malformed dirty data is skipped while consuming
> Kafka records.
>
> create table xxx (
>   ...
> ) with (
>   'connector.type' = 'kafka',
>   'format.type' = 'json',
>   'json.parser.failure.strategy' = 'silently skip'
> )
>
> [1] https://www.confluent.io/blog/kafka-connect-deep-dive-error-handling-dead-letter-queues/
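[Editor's note: the issue linked above, FLINK-15396, tracks adding an option to the JSON format that skips records which fail to parse, instead of failing the job. A minimal sketch of what such a DDL could look like, assuming the option is exposed as a boolean format property named 'format.ignore-parse-errors' (the exact property name is taken from the issue discussion and may differ in the released version):]

```sql
create table xxx (
  ...
) with (
  'connector.type' = 'kafka',
  'format.type' = 'json',
  -- assumed option from FLINK-15396: when 'true', rows whose JSON payload
  -- cannot be parsed are silently dropped instead of failing the job
  'format.ignore-parse-errors' = 'true'
)
```

With such an option set to 'false' (or absent), a single malformed Kafka record would still cause a parse exception, so a dead-letter-queue pattern as in [1] would need to be handled outside the format itself.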
Error handler strategy in Flink Kafka connector with json format
hi, community,

I am wondering if there are config params for an error handling strategy, as [1] describes, when defining a Kafka stream table using Flink SQL DDL. For example, the following 'json.parser.failure.strategy' could be set to 'silently skip' so that malformed dirty data is skipped while consuming Kafka records.

create table xxx (
  ...
) with (
  'connector.type' = 'kafka',
  'format.type' = 'json',
  'json.parser.failure.strategy' = 'silently skip'
)

[1] https://www.confluent.io/blog/kafka-connect-deep-dive-error-handling-dead-letter-queues/