Hi Community,
I am reading data from Kafka with a FlinkKafkaConsumer and then applying
some application-specific logic in a process function. If I receive any
invalid data, I throw a custom exception, handle it in the process
function itself, and emit the invalid record as a side output. The
problem is that Flink keeps retrying the same invalid messages several
times.
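In case it helps to make the intended control flow concrete, here is a minimal, Flink-free sketch (plain Python, all names illustrative) of what I am trying to achieve: each record is processed once, and records that fail validation go to a side output instead of causing a retry.

```python
# Sketch of the intended behavior, not actual Flink code:
# invalid records are caught inside the process function and routed
# to a "side output" rather than rethrown, so nothing is replayed.

def validate(record):
    """Application-specific check; raises on invalid input."""
    if not record:
        raise ValueError("invalid record")
    return record

def process(records):
    main_output, side_output = [], []
    for record in records:              # each record is seen exactly once
        try:
            main_output.append(validate(record))
        except ValueError:
            # catch here instead of letting the exception propagate:
            # an uncaught exception fails the task, and on restart the
            # same Kafka offsets are read and processed again
            side_output.append(record)
    return main_output, side_output

good, bad = process(["a", "", "b"])
# good == ["a", "b"], bad == [""]
```

My understanding is that as long as the exception is swallowed inside the process function (the real one catches it and calls `ctx.output(...)` on an `OutputTag`), the job should not restart; the retries suggest an exception is still escaping somewhere.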

Can anyone let me know how error/exception handling can be done without
the Flink job breaking?

The plan is to process every event exactly once through the process
function, without any retries.

Regards
Anil
