[https://issues.apache.org/jira/browse/NIFI-14545?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17954939#comment-17954939]

Paul Grey commented on NIFI-14545:
----------------------------------

Hi.

There is a corresponding Kafka-side limit that needs to be considered here.
The Kafka broker limits the size of individual incoming messages (the
message.max.bytes broker setting, roughly 1 MB by default) as a protection
mechanism. Clients, such as NiFi's PublishKafka, may set a corresponding
client-side limit (max.request.size) that rejects oversized records locally,
saving a network round trip that would be expected to fail. These two limits
should be kept in sync, based on the needs of the use case.
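
As a sketch (the 5 MB figure is a hypothetical target, not a recommendation), the two settings that must agree are the producer's max.request.size and the broker's message.max.bytes (or the per-topic max.message.bytes). Plain Java Properties stand in here for the producer config and broker config files:

```java
import java.util.Properties;

public class KafkaSizeLimits {
    public static void main(String[] args) {
        // Hypothetical target: allow records up to 5 MB end to end.
        int maxBytes = 5 * 1024 * 1024; // 5242880

        // Producer-side limit (exposed in NiFi PublishKafka as the
        // "max.request.size" Kafka property): oversized records fail
        // locally with RecordTooLargeException instead of being sent.
        Properties producerConfig = new Properties();
        producerConfig.setProperty("max.request.size", String.valueOf(maxBytes));

        // Broker-side limit: requests above message.max.bytes
        // (broker-wide) or max.message.bytes (per topic) are rejected.
        Properties brokerConfig = new Properties();
        brokerConfig.setProperty("message.max.bytes", String.valueOf(maxBytes));

        System.out.println(producerConfig.getProperty("max.request.size"));
        System.out.println(brokerConfig.getProperty("message.max.bytes"));
    }
}
```

If only the client-side value is raised, the broker still rejects the record; if only the broker-side value is raised, the producer fails fast before sending.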

 

[https://stackoverflow.com/questions/42507494/org-apache-kafka-common-errors-recordtoolargeexception-in-flume-kafka-sink/42507669#42507669]

https://kafka.apache.org/20/javadoc/org/apache/kafka/common/errors/RecordTooLargeException.html

> Kafka Publisher fails when content is larger than 1MB after V 2.4.0
> -------------------------------------------------------------------
>
>                 Key: NIFI-14545
>                 URL: https://issues.apache.org/jira/browse/NIFI-14545
>             Project: Apache NiFi
>          Issue Type: Bug
>    Affects Versions: 2.4.0
>            Reporter: Jordan Sammut
>            Assignee: Jordan Sammut
>            Priority: Blocker
>             Fix For: 2.5.0
>
>         Attachments: image-2025-05-08-01-03-42-572.png, 
> image-2025-05-29-12-55-26-786.png
>
>          Time Spent: 1h 40m
>  Remaining Estimate: 0h
>
> After Release 2.4.0, there still seems to be an issue where a Kafka 
> Publisher fails when attempting to publish content that is larger than 1MB



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
