[ 
https://issues.apache.org/jira/browse/FLINK-38819?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=18046264#comment-18046264
 ] 

Timo Walther edited comment on FLINK-38819 at 12/18/25 11:53 AM:
-----------------------------------------------------------------

Hi [~sverma], the link you posted is not accessible. In general, logging errors 
is tricky in a streaming system: if the schema is malformed, millions of parsing 
errors would need to be logged, which would quickly fill up local disks and the 
logging system. This is why most connectors, formats, and runtime operators 
don't log data. Also, for confidentiality reasons, data is sensitive in some 
orgs and should not be logged. 


was (Author: twalthr):
Hi [~sverma], the link you posted is not accessible. In general, logging errors 
is tricky in a streaming system because it would quickly fill up local disks 
and the logging system if the schema is malformed. This is why most 
connectors, formats, and runtime operators don't log data. Also, for 
confidentiality reasons, data is sensitive in some orgs and should not be logged. 

>  Make Flink JSON deserialization schema log parsing errors
> ----------------------------------------------------------
>
>                 Key: FLINK-38819
>                 URL: https://issues.apache.org/jira/browse/FLINK-38819
>             Project: Flink
>          Issue Type: Improvement
>            Reporter: Santwana Verma
>            Assignee: Santwana Verma
>            Priority: Major
>
> When reading the JSON format from a Kafka topic with 
> `json.ignore-parse-errors=true`, deserialization errors are not logged. If 
> this is set to false, the result is a poison pill where the job repeatedly fails. 
> The [code|https://confluentinc.atlassian.net/browse/CF-2099] confirms that 
> the errors are simply ignored. This ticket proposes logging the errors.
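For reference, the behavior described above is controlled by the JSON format's `json.ignore-parse-errors` table option. A minimal DDL sketch (topic, broker address, and column names are illustrative placeholders, not from the ticket):

```sql
CREATE TABLE events (
    id BIGINT,
    payload STRING
) WITH (
    'connector' = 'kafka',
    'topic' = 'events',                              -- placeholder topic
    'properties.bootstrap.servers' = 'localhost:9092',
    'format' = 'json',
    -- 'true'  : rows that fail to parse are silently skipped (nothing logged);
    -- 'false' : a malformed row fails the job, and after restart the same
    --           offset is re-read, i.e. the "poison pill" described above
    'json.ignore-parse-errors' = 'true'
);
```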



--
This message was sent by Atlassian Jira
(v8.20.10#820010)