Hi all, I was finally able to get the Kafka connection running after some Kafka server-related configuration changes.
For information, I'm using the Docker image from https://hub.docker.com/r/bitnami/kafka/ which creates two containers (ZooKeeper and Kafka). I've changed the docker-compose.yml by adding the lines

  - KAFKA_CFG_LISTENERS=PLAINTEXT://:9092,EXTERNAL://:9093
  - KAFKA_CFG_ADVERTISED_LISTENERS=PLAINTEXT://:9092,EXTERNAL://localhost:9093
  - KAFKA_LISTENER_SECURITY_PROTOCOL_MAP=PLAINTEXT:PLAINTEXT,EXTERNAL:PLAINTEXT

(see the full yml below). This allows an internal connection from camel-kafka inside the Docker network as well as external communication; example endpoint URIs for both listeners are sketched at the end of this mail.

Best
- Gerald

version: '2'

services:
  zookeeper:
    container_name: zookeeper
    image: 'docker.io/bitnami/zookeeper:3-debian-10'
    ports:
      - '2181:2181'
    volumes:
      - 'zookeeper_data:/bitnami'
    environment:
      - ALLOW_ANONYMOUS_LOGIN=yes
  kafka:
    container_name: kafka
    image: 'docker.io/bitnami/kafka:2-debian-10'
    ports:
      - '9092:9092'
      - '9093:9093'
    volumes:
      - 'kafka_data:/bitnami'
    environment:
      - KAFKA_CFG_ZOOKEEPER_CONNECT=zookeeper:2181
      - KAFKA_CFG_LISTENERS=PLAINTEXT://:9092,EXTERNAL://:9093
      - KAFKA_CFG_ADVERTISED_LISTENERS=PLAINTEXT://:9092,EXTERNAL://localhost:9093
      - KAFKA_LISTENER_SECURITY_PROTOCOL_MAP=PLAINTEXT:PLAINTEXT,EXTERNAL:PLAINTEXT
      - ALLOW_PLAINTEXT_LISTENER=yes
    depends_on:
      - zookeeper

volumes:
  zookeeper_data:
    driver: local
  kafka_data:
    driver: local

networks:
  default:
    external:
      name: casisp

> Jean-Baptiste Onofre <j...@nanthrax.net> wrote on 29.06.2020 08:14:
>
> Hi,
>
> Don't you have another trace in the log, like missing Avro packages (ClassNotFoundException) in the Kafka client bundle?
>
> I saw an issue with the Pulsar bundle missing an Avro import (it used avro.* imports whereas the "new" Avro release uses org.apache.avro.*). Maybe an optional Avro import is required in your case (even if kafka-clients doesn't define the dependency explicitly).
>
> Regards
> JB
>
> > On 27 June 2020 at 09:45, Gerald Kallas <catsh...@mailbox.org> wrote:
> >
> > Dear all,
> >
> > I've set up Camel 3.4.0 and want to consume from a Kafka topic. My route looks like:
> >
> > <blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0">
> >   <camelContext xmlns="http://camel.apache.org/schema/blueprint" streamCache="true">
> >     <route>
> >       <from uri="kafka:casisp?brokers=ec2-3-124-33-3.eu-central-1.compute.amazonaws.com:9092" />
> >       <log message="body: ${in.body}" />
> >     </route>
> >   </camelContext>
> > </blueprint>
> >
> > The log shows the connection error as below.
> >
> > A connection from the Kafka CLI or another tool works without problems.
> >
> > Any hints are appreciated.
> >
> > - Gerald
> >
> >
> > Log file excerpt
> >
> > 2020-06-27T07:37:29,692 | INFO | Blueprint Event Dispatcher: 1 | KafkaConsumer | 120 - org.apache.camel.camel-kafka - 3.4.0 | Starting Kafka consumer on topic: casisp with breakOnFirstError: false
> > 2020-06-27T07:37:29,710 | INFO | Blueprint Event Dispatcher: 1 | ConsumerConfig | 195 - org.apache.servicemix.bundles.kafka-clients - 2.4.1.1 | ConsumerConfig values:
> >     allow.auto.create.topics = true
> >     auto.commit.interval.ms = 5000
> >     auto.offset.reset = latest
> >     bootstrap.servers = [ec2-3-124-33-3.eu-central-1.compute.amazonaws.com:9092]
> >     check.crcs = true
> >     client.dns.lookup = default
> >     client.id =
> >     client.rack =
> >     connections.max.idle.ms = 540000
> >     default.api.timeout.ms = 60000
> >     enable.auto.commit = true
> >     exclude.internal.topics = true
> >     fetch.max.bytes = 52428800
> >     fetch.max.wait.ms = 500
> >     fetch.min.bytes = 1
> >     group.id = 689cc1d7-fcf7-4565-9056-ddd7218d261f
> >     group.instance.id = null
> >     heartbeat.interval.ms = 3000
> >     interceptor.classes = []
> >     internal.leave.group.on.close = true
> >     isolation.level = read_uncommitted
> >     key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer
> >     max.partition.fetch.bytes = 1048576
> >     max.poll.interval.ms = 300000
> >     max.poll.records = 500
> >     metadata.max.age.ms = 300000
> >     metric.reporters = []
> >     metrics.num.samples = 2
> >     metrics.recording.level = INFO
> >     metrics.sample.window.ms = 30000
> >     partition.assignment.strategy = [org.apache.kafka.clients.consumer.RangeAssignor]
> >     receive.buffer.bytes = 65536
> >     reconnect.backoff.max.ms = 1000
> >     reconnect.backoff.ms = 50
> >     request.timeout.ms = 40000
> >     retry.backoff.ms = 100
> >     sasl.client.callback.handler.class = null
> >     sasl.jaas.config = null
> >     sasl.kerberos.kinit.cmd = /usr/bin/kinit
> >     sasl.kerberos.min.time.before.relogin = 60000
> >     sasl.kerberos.service.name = null
> >     sasl.kerberos.ticket.renew.jitter = 0.05
> >     sasl.kerberos.ticket.renew.window.factor = 0.8
> >     sasl.login.callback.handler.class = null
> >     sasl.login.class = null
> >     sasl.login.refresh.buffer.seconds = 300
> >     sasl.login.refresh.min.period.seconds = 60
> >     sasl.login.refresh.window.factor = 0.8
> >     sasl.login.refresh.window.jitter = 0.05
> >     sasl.mechanism = GSSAPI
> >     security.protocol = PLAINTEXT
> >     security.providers = null
> >     send.buffer.bytes = 131072
> >     session.timeout.ms = 10000
> >     ssl.cipher.suites = null
> >     ssl.enabled.protocols = [TLSv1.2]
> >     ssl.endpoint.identification.algorithm = https
> >     ssl.key.password = null
> >     ssl.keymanager.algorithm = SunX509
> >     ssl.keystore.location = null
> >     ssl.keystore.password = null
> >     ssl.keystore.type = JKS
> >     ssl.protocol = TLSv1.2
> >     ssl.provider = null
> >     ssl.secure.random.implementation = null
> >     ssl.trustmanager.algorithm = PKIX
> >     ssl.truststore.location = null
> >     ssl.truststore.password = null
> >     ssl.truststore.type = JKS
> >     value.deserializer = class org.apache.kafka.common.serialization.StringDeserializer
> >
> > 2020-06-27T07:37:29,884 | WARN | Blueprint Event Dispatcher: 1 | ConsumerConfig | 195 - org.apache.servicemix.bundles.kafka-clients - 2.4.1.1 | The configuration 'specific.avro.reader' was supplied but isn't a known config.
> > 2020-06-27T07:37:29,884 | INFO | Blueprint Event Dispatcher: 1 | AppInfoParser | 195 - org.apache.servicemix.bundles.kafka-clients - 2.4.1.1 | Kafka version: 2.4.1
> > 2020-06-27T07:37:29,885 | INFO | Blueprint Event Dispatcher: 1 | AppInfoParser | 195 - org.apache.servicemix.bundles.kafka-clients - 2.4.1.1 | Kafka commitId: c57222ae8cd7866b
> > 2020-06-27T07:37:29,885 | INFO | Blueprint Event Dispatcher: 1 | AppInfoParser | 195 - org.apache.servicemix.bundles.kafka-clients - 2.4.1.1 | Kafka startTimeMs: 1593243449884
> > 2020-06-27T07:37:29,888 | INFO | Blueprint Event Dispatcher: 1 | InternalRouteStartupManager | 95 - org.apache.camel.camel-base - 3.4.0 | Route: route9 started and consuming from: kafka://casisp
> > 2020-06-27T07:37:29,890 | INFO | Blueprint Event Dispatcher: 1 | AbstractCamelContext | 95 - org.apache.camel.camel-base - 3.4.0 | Total 1 routes, of which 1 are started
> > 2020-06-27T07:37:29,892 | INFO | Blueprint Event Dispatcher: 1 | AbstractCamelContext | 95 - org.apache.camel.camel-base - 3.4.0 | Apache Camel 3.4.0 (camel-9) started in 0.374 seconds
> > 2020-06-27T07:37:29,893 | INFO | Camel (camel-9) thread #21 - KafkaConsumer[casisp] | KafkaConsumer | 120 - org.apache.camel.camel-kafka - 3.4.0 | Subscribing casisp-Thread 0 to topic casisp
> > 2020-06-27T07:37:29,900 | INFO | Camel (camel-9) thread #21 - KafkaConsumer[casisp] | KafkaConsumer | 195 - org.apache.servicemix.bundles.kafka-clients - 2.4.1.1 | [Consumer clientId=consumer-689cc1d7-fcf7-4565-9056-ddd7218d261f-9, groupId=689cc1d7-fcf7-4565-9056-ddd7218d261f] Subscribed to topic(s): casisp
> > 2020-06-27T07:37:29,918 | INFO | fileinstall-/opt/apache-karaf-4.2.9/deploy | fileinstall | 10 - org.apache.felix.fileinstall - 3.6.6 | Started bundle: blueprint:file:/opt/apache-karaf-4.2.9/deploy/isp.route.TSTISP001.xml
> > 2020-06-27T07:37:30,096 | INFO | Camel (camel-9) thread #21 - KafkaConsumer[casisp] | Metadata | 195 - org.apache.servicemix.bundles.kafka-clients - 2.4.1.1 | [Consumer clientId=consumer-689cc1d7-fcf7-4565-9056-ddd7218d261f-9, groupId=689cc1d7-fcf7-4565-9056-ddd7218d261f] Cluster ID: NIuHmfEiSOGS7Bm56dOngg
> > 2020-06-27T07:37:30,097 | INFO | Camel (camel-9) thread #21 - KafkaConsumer[casisp] | AbstractCoordinator | 195 - org.apache.servicemix.bundles.kafka-clients - 2.4.1.1 | [Consumer clientId=consumer-689cc1d7-fcf7-4565-9056-ddd7218d261f-9, groupId=689cc1d7-fcf7-4565-9056-ddd7218d261f] Discovered group coordinator localhost:9092 (id: 2147482646 rack: null)
> > 2020-06-27T07:37:30,098 | INFO | Camel (camel-9) thread #21 - KafkaConsumer[casisp] | AbstractCoordinator | 195 - org.apache.servicemix.bundles.kafka-clients - 2.4.1.1 | [Consumer clientId=consumer-689cc1d7-fcf7-4565-9056-ddd7218d261f-9, groupId=689cc1d7-fcf7-4565-9056-ddd7218d261f] (Re-)joining group
> > 2020-06-27T07:37:30,100 | WARN | Camel (camel-9) thread #21 - KafkaConsumer[casisp] | NetworkClient | 195 - org.apache.servicemix.bundles.kafka-clients - 2.4.1.1 | [Consumer clientId=consumer-689cc1d7-fcf7-4565-9056-ddd7218d261f-9, groupId=689cc1d7-fcf7-4565-9056-ddd7218d261f] Connection to node 2147482646 (localhost/127.0.0.1:9092) could not be established. Broker may not be available.
> > 2020-06-27T07:37:30,100 | INFO | Camel (camel-9) thread #21 - KafkaConsumer[casisp] | AbstractCoordinator | 195 - org.apache.servicemix.bundles.kafka-clients - 2.4.1.1 | [Consumer clientId=consumer-689cc1d7-fcf7-4565-9056-ddd7218d261f-9, groupId=689cc1d7-fcf7-4565-9056-ddd7218d261f] Group coordinator localhost:9092 (id: 2147482646 rack: null) is unavailable or invalid, will attempt rediscovery
> > 2020-06-27T07:37:30,202 | WARN | Camel (camel-9) thread #21 - KafkaConsumer[casisp] | NetworkClient | 195 - org.apache.servicemix.bundles.kafka-clients - 2.4.1.1 | [Consumer clientId=consumer-689cc1d7-fcf7-4565-9056-ddd7218d261f-9, groupId=689cc1d7-fcf7-4565-9056-ddd7218d261f] Connection to node 1001 (localhost/127.0.0.1:9092) could not be established. Broker may not be available.
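
For reference, a minimal sketch of the endpoint URIs that match the two listeners in the compose file above. It assumes the consumer runs either in a container attached to the casisp network (where the Kafka container is reachable under its service/container name kafka) or directly on the Docker host; adjust the host names to your environment.

From inside the Docker network (internal PLAINTEXT listener on 9092):

  <from uri="kafka:casisp?brokers=kafka:9092" />

From outside Docker, e.g. Karaf running on the host (EXTERNAL listener advertised as localhost:9093):

  <from uri="kafka:casisp?brokers=localhost:9093" />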
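
Regarding JB's hint about a possibly missing Avro import: this can be checked from the Karaf shell, e.g. (bundle id 195 taken from the log above; just an illustration, it was not needed for my case in the end):

  karaf@root()> log:exception-display
  karaf@root()> bundle:headers 195

The first command shows the last stack trace (a ClassNotFoundException would point to the missing package), the second shows the kafka-clients bundle's Import-Package header.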