[ https://issues.apache.org/jira/browse/KAFKA-9747?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Vit Koma updated KAFKA-9747:
----------------------------
     Attachment: connect-distributed.properties
    Description: 
We are running Kafka Connect in distributed mode on 3 nodes, using the Debezium 
(MongoDB) and Confluent S3 connectors. When we add a new connector via the REST 
API, the connector is created in the RUNNING state, but no tasks are created 
for it.

Pausing and resuming the connector does not help. When we stop all workers and 
then start them again, the tasks are created and everything runs as it should.
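
For reference, the pause/resume cycle goes through the standard Connect REST 
API; a rough sketch of the calls we issue (port 10083 matches our worker 
config, <connector_name> is a placeholder for the connector name):

  curl -X PUT http://<connect_host>:10083/connectors/<connector_name>/pause
  curl -X PUT http://<connect_host>:10083/connectors/<connector_name>/resume

Even after this cycle, no tasks appear for the connector.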

The issue does not show up if we run only a single node.

The issue is not caused by the connector plugins, because we see the same 
behaviour with both the Debezium and S3 connectors. Also, in the debug logs I 
can see that Debezium correctly returns task configurations from the 
Connector.taskConfigs() method.
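
As an illustration of the state we observe (a sketch, assuming the standard 
Connect REST endpoints; the exact output shape may differ slightly), the task 
list for the connector stays empty even though the connector itself is RUNNING:

  curl http://<connect_host>:10083/connectors/<connector_name>/tasks
  # returns an empty array:
  []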

Connector configuration examples:

Debezium:

{
  "name": "qa-mongodb-comp-converter-task|1",
  "config": {
    "connector.class": "io.debezium.connector.mongodb.MongoDbConnector",
    "mongodb.hosts": 
"mongodb-qa-001:27017,mongodb-qa-002:27017,mongodb-qa-003:27017",
    "mongodb.name": "qa-debezium-comp",
    "mongodb.ssl.enabled": true,
    "collection.whitelist": "converter[.]task",
    "tombstones.on.delete": true
  }
}

S3 Connector:

{
  "name": "qa-s3-sink-task|1",
  "config": {
    "connector.class": "io.confluent.connect.s3.S3SinkConnector",
    "topics": "qa-debezium-comp.converter.task",
    "topics.dir": "data/env/qa",
    "s3.region": "eu-west-1",
    "s3.bucket.name": "<bucket-name>",
    "flush.size": "15000",
    "rotate.interval.ms": "3600000",
    "storage.class": "io.confluent.connect.s3.storage.S3Storage",
    "format.class": "custom.kafka.connect.s3.format.plaintext.PlaintextFormat",
    "schema.generator.class": 
"io.confluent.connect.storage.hive.schema.DefaultSchemaGenerator",
    "partitioner.class": 
"io.confluent.connect.storage.partitioner.DefaultPartitioner",
    "schema.compatibility": "NONE",
    "key.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "key.converter.schemas.enable": false,
    "value.converter.schemas.enable": false,
    "transforms": "ExtractDocument",
    
"transforms.ExtractDocument.type":"custom.kafka.connect.transforms.ExtractDocument$Value"
  }
}

The connectors are created using curl: {{curl -X POST -H "Content-Type: 
application/json" --data @<json_file> http://<connect_host>:10083/connectors}}



  was:
We are running Kafka Connect in distributed mode on 3 nodes, using the Debezium 
(MongoDB) and Confluent S3 connectors. When we add a new connector via the REST 
API, the connector is created in the RUNNING state, but no tasks are created 
for it.

Pausing and resuming the connector does not help. When we stop all workers and 
then start them again, the tasks are created and everything runs as it should.

The issue does not show up if we run only a single node.

The issue is not caused by the connector plugins, because we see the same 
behaviour with both the Debezium and S3 connectors. Also, in the debug logs I 
can see that Debezium correctly returns task configurations from the 
Connector.taskConfigs() method.


> No tasks created for a connector
> --------------------------------
>
>                 Key: KAFKA-9747
>                 URL: https://issues.apache.org/jira/browse/KAFKA-9747
>             Project: Kafka
>          Issue Type: Bug
>          Components: KafkaConnect
>    Affects Versions: 2.4.0
>         Environment: OS: Ubuntu 18.04 LTS
> Platform: Confluent Platform 5.4
> HW: The same behaviour on various AWS instances - from t3.small to c5.xlarge
>            Reporter: Vit Koma
>            Priority: Major
>         Attachments: connect-distributed.properties
>



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
