[
https://issues.apache.org/jira/browse/KAFKA-7217?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Greg Harris resolved KAFKA-7217.
Fix Version/s: 1.1.0
Resolution: Fixed
> Loading dynamic topic data into kafka connector sink using regex
> -Original Message-
> From: Chris Egerton
> Sent: Thursday, 2 March 2023 17:33
> To: dev@kafka.apache.org
> Subject: Re: Problems with Kafka Connector Base classes
>
G_FOR_TASK_CONFIG" - that
would be awesome 😉
Thanks
Michael
-Original Message-
From: Chris Egerton
Sent: Thursday, 2 March 2023 17:33
To: dev@kafka.apache.org
Subject: Re: Problems with Kafka Connector Base classes
Hi Michael,
The purpose of the Connector::taskConfigs method is to specify (implicitly)
the number of tasks that need to be run for the connector, and (explicitly)
how each task should be configured. The former is provided implicitly by
the size of the list returned from that method; for each element of that
list, the framework starts one task and passes that element's map as the
task's configuration.
Hi
I'm using: org.apache.kafka:connect-api:3.4.0
I have a simple connector:
==
public class SimpleSinkConnector extends SinkConnector {
    @Override
    public List<Map<String, String>> taskConfigs(int maxTasks) {
        ArrayList<Map<String, String>> configs = new ArrayList<>();
        for (int i = 0; i < maxTasks; i++) {
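Filling in the truncated loop, here is a minimal self-contained sketch of how taskConfigs is often written: partition a connector-level list of work items across at most maxTasks task configurations. The tables list and the "tables" key are illustrative assumptions, not standard Connect properties, and the real method would live on a Connector subclass from connect-api.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class TaskConfigSketch {
    // Round-robin a list of resources (here, table names) across task configs.
    // The size of the returned list determines how many tasks Connect starts.
    public static List<Map<String, String>> taskConfigs(List<String> tables, int maxTasks) {
        int numTasks = Math.min(maxTasks, tables.size());
        List<Map<String, String>> configs = new ArrayList<>();
        for (int i = 0; i < numTasks; i++) {
            configs.add(new HashMap<>());
        }
        for (int i = 0; i < tables.size(); i++) {
            Map<String, String> cfg = configs.get(i % numTasks);
            // "tables" is an illustrative key; each task parses its own share.
            cfg.merge("tables", tables.get(i), (a, b) -> a + "," + b);
        }
        return configs;
    }

    public static void main(String[] args) {
        List<Map<String, String>> configs =
                taskConfigs(List.of("orders", "users", "payments"), 2);
        System.out.println(configs.size());               // 2 tasks
        System.out.println(configs.get(0).get("tables")); // orders,payments
    }
}
```

Each task then reads its assigned share back out of the map it receives in SinkTask::start.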
[
https://issues.apache.org/jira/browse/KAFKA-12795?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
pengWei Dou resolved KAFKA-12795.
-
Resolution: Won't Fix
> Create kafka connector use chinese character failed
pengWei Dou created KAFKA-12795:
---
Summary: Create kafka connector use chinese character failed
Key: KAFKA-12795
URL: https://issues.apache.org/jira/browse/KAFKA-12795
Project: Kafka
Issue Type
[
https://issues.apache.org/jira/browse/KAFKA-12182?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Chris Egerton resolved KAFKA-12182.
---
Resolution: Duplicate
> Kafka connector is failing due to ' OffsetStorageWriter is already flushing' error
Fátima Galera created KAFKA-12182:
-
Summary: Kafka connector is failing due to ' OffsetStorageWriter
is already flushing' error
Key: KAFKA-12182
URL: https://issues.apache.org/jira/browse/K
Hi,
This is Monisha. I am a Netsuite developer, and I want to connect Netsuite
and Apache Kafka. It would be great if you could provide me with information
about how to connect Kafka and Netsuite.
Looking forward to your reply.
--
thank you.
Monisha Kodi
Phone: +49 15124770514
Dear teams,
We use the Kafka 1.1.0 connector to load data, and our application starts
connectors via the REST API.
Before starting a connector, we check that it does not already exist and
then create it. This logic is encapsulated in a Quartz job, and each
connector has its own job.
We use Spring RestTemplate as below
Hello! I have a question. We have a cluster with several Connect workers, and
we have many different connectors. We need to give each connector its own
settings: max.in.flight.requests.per.connection, partitioner.class, acks. But
I am having difficulties. How can I do that? Thanks
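For what it's worth, Kafka 2.3+ supports per-connector client overrides (KIP-458), provided the worker is started with `connector.client.config.override.policy=All` (or a custom policy). A sketch of a connector submission using the `producer.override.` prefix — the connector class, partitioner class, and name below are illustrative placeholders, not values from this thread:

```json
{
  "name": "my-source",
  "config": {
    "connector.class": "com.example.MySourceConnector",
    "tasks.max": "1",
    "producer.override.acks": "all",
    "producer.override.max.in.flight.requests.per.connection": "1",
    "producer.override.partitioner.class": "com.example.MyPartitioner"
  }
}
```

Sink connectors use the analogous `consumer.override.` prefix for their consumer settings.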
Hi:
I am using Kafka Connect version 2.0.0, and below is my setup of the
Connect worker for the HTTPS protocol.
1. Create the SSL key
keytool -keystore kafka.server.keystore.jks -alias localhost -validity 365
-genkey
openssl req -new -x509 -keyout ca-key -out ca-cert -days 365
keytool
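Once the keystore and truststore exist, the worker itself has to be pointed at them. A sketch of the relevant worker.properties fragment, assuming Kafka 2.0+ where the Connect REST API supports HTTPS listeners (KIP-208) — hostnames, paths, and passwords are placeholders:

```properties
# Serve the Connect REST API over HTTPS (supported since Kafka 2.0 / KIP-208).
listeners=https://connect-host:8443
listeners.https.ssl.keystore.location=/path/to/kafka.server.keystore.jks
listeners.https.ssl.keystore.password=changeit
listeners.https.ssl.key.password=changeit
listeners.https.ssl.truststore.location=/path/to/kafka.server.truststore.jks
listeners.https.ssl.truststore.password=changeit
```

If the `listeners.https.`-prefixed settings are absent, the worker falls back to its top-level `ssl.*` settings.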
Pratik Gaglani created KAFKA-7217:
-
Summary: Loading dynamic topic data into kafka connector sink
using regex
Key: KAFKA-7217
URL: https://issues.apache.org/jira/browse/KAFKA-7217
Project: Kafka
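The feature this issue asks for exists in Connect as the sink-side `topics.regex` setting (KIP-215, Kafka 1.1+), which subscribes a sink connector to every topic matching a pattern instead of a fixed `topics` list. A sketch using the built-in FileStreamSinkConnector — the connector name, pattern, and file path are illustrative:

```json
{
  "name": "regex-sink",
  "config": {
    "connector.class": "org.apache.kafka.connect.file.FileStreamSinkConnector",
    "tasks.max": "1",
    "topics.regex": "metrics-.*",
    "file": "/tmp/metrics.out"
  }
}
```

`topics` and `topics.regex` are mutually exclusive; a sink connector config must set exactly one of them.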
Chen He created KAFKA-5518:
--
Summary: General Kafka connector performance workload
Key: KAFKA-5518
URL: https://issues.apache.org/jira/browse/KAFKA-5518
Project: Kafka
Issue Type: Bug
, victory_zgh wrote:
> Hi,
> Recently, I am working on a Kafka connector implementation. According to
> the sink task documentation, the Connect platform invokes the flush method
> periodically, but the doc does not mention the flush interval.
> I don't know the flush interval's default value
Hi,
Recently, I am working on a Kafka connector implementation. According to the
sink task documentation, the Connect platform invokes the flush method
periodically, but the doc does not mention the flush interval.
I don't know the flush interval's default value, or where we can set this
interval.
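The interval in question is the worker-level offset commit interval: the framework commits offsets and, for sink tasks, calls flush()/preCommit() on that schedule. A sketch of the relevant worker.properties settings, with values matching the documented defaults:

```properties
# How often the framework commits offsets (and flushes sink tasks).
offset.flush.interval.ms=60000
# How long a single offset flush may take before it is cancelled and retried.
offset.flush.timeout.ms=5000
```

Both are worker settings, so they apply to every connector running on that worker rather than being configurable per connector.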
I thought this would be of interest:
https://developer.zendesk.com/blog/introducing-maxwell-a-mysql-to-kafka-binlog-processor
A copycat connector that parses MySQL binlogs would be rather useful, I
think. Streaming connectors using JDBC are tricky to implement because they
rely on an indexed timestamp