[
https://issues.apache.org/jira/browse/KAFKA-9504?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17032739#comment-17032739
]
Ted Yu commented on KAFKA-9504:
-------------------------------
It seems that closing the metrics is not enough to prevent the memory
leak:
{code}
Utils.closeQuietly(kafkaConsumerMetrics, "kafka consumer metrics",
firstException);
Utils.closeQuietly(metrics, "consumer metrics", firstException);
{code}
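One way to confirm whether the MBeans are actually released after close() is to query the platform MBeanServer directly. A minimal sketch (the helper name and the "kafka.consumer" domain are illustrative; the actual domain depends on which client registered the beans):

```java
import java.lang.management.ManagementFactory;
import java.util.Set;
import javax.management.MBeanServer;
import javax.management.MalformedObjectNameException;
import javax.management.ObjectName;

// Hypothetical helper: counts the MBeans currently registered in a given
// JMX domain, e.g. "kafka.consumer". If close() unregisters everything,
// the count for the client's domain should return to its pre-construction value.
public class MBeanLeakCheck {
    public static int countBeansInDomain(String domain) throws MalformedObjectNameException {
        MBeanServer server = ManagementFactory.getPlatformMBeanServer();
        // Pattern "<domain>:*" matches every MBean registered under that domain.
        Set<ObjectName> names = server.queryNames(new ObjectName(domain + ":*"), null);
        return names.size();
    }

    public static void main(String[] args) throws Exception {
        // Compare this count before creating and after closing the consumer.
        System.out.println("kafka.consumer beans: " + countBeansInDomain("kafka.consumer"));
    }
}
```

Dumping the matched ObjectNames (rather than just counting) also shows exactly which metric MBeans survived close().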
> Memory leak in KafkaMetrics registered to MBean
> -----------------------------------------------
>
> Key: KAFKA-9504
> URL: https://issues.apache.org/jira/browse/KAFKA-9504
> Project: Kafka
> Issue Type: Bug
> Components: clients
> Affects Versions: 2.4.0
> Reporter: Andreas Holmén
> Priority: Major
>
> After close() is called on a KafkaConsumer, some registered MBeans are not
> unregistered, causing a leak.
>
>
> {code:java}
> import static org.apache.kafka.clients.consumer.ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG;
>
> import java.lang.management.ManagementFactory;
> import java.util.HashMap;
> import java.util.Map;
> import javax.management.MBeanServer;
>
> import org.apache.kafka.clients.consumer.KafkaConsumer;
> import org.apache.kafka.common.serialization.ByteArrayDeserializer;
>
> public class Leaker {
>     private static String bootstrapServers = "hostname:9092";
>
>     public static void main(String[] args) throws InterruptedException {
>         MBeanServer mBeanServer = ManagementFactory.getPlatformMBeanServer();
>         Map<String, Object> props = new HashMap<>();
>         props.put(BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
>
>         int beans = mBeanServer.getMBeanCount();
>         for (int i = 0; i < 100; i++) {
>             KafkaConsumer<byte[], byte[]> consumer =
>                 new KafkaConsumer<>(props, new ByteArrayDeserializer(), new ByteArrayDeserializer());
>             consumer.close();
>         }
>         int newBeans = mBeanServer.getMBeanCount();
>         System.out.println("\nbeans delta: " + (newBeans - beans));
>     }
> }
> {code}
>
>
--
This message was sent by Atlassian Jira
(v8.3.4#803005)