[ https://issues.apache.org/jira/browse/KAFKA-9461?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Nicolas Guyomar updated KAFKA-9461:
-----------------------------------
    Description: 
Hi,

With the current implementation it is possible to log the full record content at DEBUG level, which can overwhelm the log4j buffer and cause an OutOfMemoryError.

The stack trace below was caused by a 70 MB message that was refused by a broker:
{code:java}
java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Arrays.java:3332)
at java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java:124)
at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:448)
at java.lang.StringBuffer.append(StringBuffer.java:270)
at org.apache.log4j.helpers.PatternParser$LiteralPatternConverter.format(PatternParser.java:419)
at org.apache.log4j.PatternLayout.format(PatternLayout.java:506)
at org.apache.log4j.WriterAppender.subAppend(WriterAppender.java:310)
at org.apache.log4j.RollingFileAppender.subAppend(RollingFileAppender.java:276)
at org.apache.log4j.WriterAppender.append(WriterAppender.java:162)
at org.apache.log4j.AppenderSkeleton.doAppend(AppenderSkeleton.java:251)
at org.apache.log4j.helpers.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:66)
at org.apache.log4j.Category.callAppenders(Category.java:206)
at org.apache.log4j.Category.forcedLog(Category.java:391)
at org.apache.log4j.Category.log(Category.java:856)
at org.slf4j.impl.Log4jLoggerAdapter.debug(Log4jLoggerAdapter.java:252)
at org.apache.kafka.connect.runtime.WorkerSourceTask$1.onCompletion(WorkerSourceTask.java:330)
{code}
The corresponding log statement is in WorkerSourceTask:
[https://github.com/apache/kafka/blob/da4337271ef0b72643c0cf47ae51e69f79cf1901/connect/runtime/src/main/java/org/apache/kafka/connect/runtime/WorkerSourceTask.java#L348]
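
For illustration, the pattern that triggers this is roughly the sketch below. The class and method names are made up for the example and are not the actual Connect source; the point is only that the record object reaches log.debug(), and rendering its toString() into the log4j buffer is what exhausts the heap:
{code:java}
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

// Illustrative sketch only, not the real WorkerSourceTask code.
public class FailedRecordLoggingSketch {
    private static final Logger log = LoggerFactory.getLogger(FailedRecordLoggingSketch.class);

    // Called when the producer rejects a record (e.g. a 70 MB value over the broker limit).
    void onSendFailure(Object record, Exception error) {
        log.error("Failed to send record", error);
        // With DEBUG enabled, record.toString() renders the full value (possibly tens of MB)
        // into the log4j StringBuffer, which is where the OutOfMemoryError above originates.
        log.debug("Failed record: {}", record);
    }
}
{code}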

Would it make sense to protect Connect directly in the ConnectRecord toString() method by applying a configurable limit to how much of the record content gets rendered?
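
As a rough illustration of what such a limit could look like, here is a minimal sketch; the class, method, and property names are hypothetical and not an existing Connect API:
{code:java}
// Hypothetical helper, assuming a worker-level setting such as
// "record.tostring.max.chars" (name invented for this sketch).
public final class ValueTruncator {

    private ValueTruncator() {
    }

    /**
     * Renders the value via toString() but caps the result at maxChars,
     * appending a note with how much was dropped.
     */
    public static String truncate(Object value, int maxChars) {
        if (value == null) {
            return "null";
        }
        String rendered = value.toString();
        if (rendered.length() <= maxChars) {
            return rendered;
        }
        return rendered.substring(0, maxChars)
                + "... (" + (rendered.length() - maxChars) + " characters truncated)";
    }
}
{code}
ConnectRecord.toString(), or the DEBUG statement itself, could route the value through such a helper, with the limit read from a configurable worker property rather than hard coded. Note that this sketch still builds the full string once before truncating; a tighter variant could cap the output field by field inside toString().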

Thank you

> Limit DEBUG statement size when logging failed record value
> -----------------------------------------------------------
>
>                 Key: KAFKA-9461
>                 URL: https://issues.apache.org/jira/browse/KAFKA-9461
>             Project: Kafka
>          Issue Type: Improvement
>          Components: KafkaConnect
>    Affects Versions: 2.4.0
>            Reporter: Nicolas Guyomar
>            Priority: Minor



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
