JDBC Source Connector holding multiple active sessions

2021-04-19 Thread vishnu murali
Hi everyone,

We are using the Kafka JDBC source connector to connect to a Redshift database.

We created three connectors to pull data from three different tables into
three different topics.

We observed that each connector holds multiple active sessions.

Even after we delete the connectors, the active sessions remain open and
are not closed.

Has anyone faced the same kind of issue?

Is there any configuration available to restrict this behaviour?

Does anyone know how to overcome this issue?
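
Not a connector setting, but the lingering sessions can at least be inspected and cleaned up on the Redshift side while the root cause is investigated. A sketch using `psql` (the host, database, user, and process id below are placeholders); `STV_SESSIONS` and `pg_terminate_backend()` are standard Redshift system facilities:

```shell
# List active sessions so you can spot the ones opened by the connector
psql -h my-cluster.example.redshift.amazonaws.com -p 5439 -d mydb -U admin \
  -c "SELECT process, user_name, db_name, starttime FROM stv_sessions;"

# Terminate a specific lingering session by its process id
psql -h my-cluster.example.redshift.amazonaws.com -p 5439 -d mydb -U admin \
  -c "SELECT pg_terminate_backend(12345);"
```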


Re: JDBC source connector

2020-05-14 Thread Robin Moffatt
If you just want the data once, then delete the connector once it has
processed all the data.


-- 

Robin Moffatt | Senior Developer Advocate | ro...@confluent.io | @rmoff


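
The delete-once-done approach suggested above can be driven through the Kafka Connect REST API. A sketch; the connector name `jdbc-bulk-load` is hypothetical:

```shell
# Check the connector and task state before removing it
curl -s localhost:8083/connectors/jdbc-bulk-load/status

# Once the one-off load has landed in the topic, delete the connector
curl -s -X DELETE localhost:8083/connectors/jdbc-bulk-load
```

Deleting the connector stops its tasks; the data already produced to the topic is unaffected.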


Re: JDBC source connector

2020-05-14 Thread vishnu murali
Thanks Liam,

But what I am asking is: assume I have 10 records.

Using the JDBC source I need to push them only once.

No additional data will be added to that table in the future.

In that case I need to push the data only once, not more than once.

That is the scenario I am asking about.



Re: JDBC source connector

2020-05-14 Thread Liam Clarke-Hutchinson
Why not use `incrementing` mode with an auto-increment column? It'll only
emit new records on subsequent polls then.



JDBC source connector

2020-05-14 Thread vishnu murali
Hi Guys,

I am using mode *bulk* and poll.interval.ms *10* in the source
connector configuration.

But I don't need to load the data a second time.

I need to load the data only once.

How can I do this?
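
For the load-once scenario in this thread, the replies suggest switching from `bulk` to `incrementing` mode: the connector then tracks the highest `id` it has seen and only emits new rows on later polls, so a table that never changes is read exactly once. A config sketch (the connector, connection, table, and topic names below are hypothetical):

```json
{
  "name": "jdbc-source-once",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:mysql://localhost:3306/mydb",
    "connection.user": "user",
    "connection.password": "password",
    "mode": "incrementing",
    "incrementing.column.name": "id",
    "table.whitelist": "my_table",
    "topic.prefix": "mysql-",
    "poll.interval.ms": "10000"
  }
}
```

Once the initial snapshot has been produced to the topic, the connector sits idle (no new rows, no new records), and can be deleted if no longer needed.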


Re: Jdbc Source Connector Config

2020-05-13 Thread Robin Moffatt
You don't have to use Single Message Transforms (which is what these are) at
all if you don't want to.
However, they serve a useful purpose when you want to modify data as it
passes through Kafka Connect.

Ref:
-
https://www.confluent.io/blog/simplest-useful-kafka-connect-data-pipeline-world-thereabouts-part-3/
-
https://www.confluent.io/blog/kafka-connect-deep-dive-jdbc-source-connector/
- http://rmoff.dev/ksldn19-kafka-connect


-- 

Robin Moffatt | Senior Developer Advocate | ro...@confluent.io | @rmoff




Jdbc Source Connector Config

2020-05-13 Thread vishnu murali
Hi Guys,

I have a question about these settings:

"transforms": "createKey,extractInt",
"transforms.createKey.type": "org.apache.kafka.connect.transforms.ValueToKey",
"transforms.createKey.fields": "id",
"transforms.extractInt.type": "org.apache.kafka.connect.transforms.ExtractField$Key",
"transforms.extractInt.field": "id"

What is the need for these configurations in the JdbcSourceConnector?
And can we use the SourceConnector without them?
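
For context on what these two transforms do: `ValueToKey` copies the `id` field out of the record value into the record key (as a one-field struct), and `ExtractField$Key` then unwraps that struct so the key becomes the bare `id` value. Without them the connector still works, but records are produced with a null key, so you lose key-based partitioning and log compaction by `id`. A sketch of the effect on a single record (the field values are made up):

```
original:                key = null        value = {"id": 42, "name": "widget"}
after ValueToKey:        key = {"id": 42}  value = {"id": 42, "name": "widget"}
after ExtractField$Key:  key = 42          value = {"id": 42, "name": "widget"}
```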


Re: JDBC source connector to Kafka topic

2020-04-29 Thread vishnu murali
I am using plain Apache Kafka, not Confluent.

So I am starting Kafka step by step:

.\bin\windows\kafka-server-start.bat .\config\server.properties

I start it like this, and even before any shutdown I still need to send the
configuration details through a POST request every time to get the data from
the DB.



Re: JDBC source connector to Kafka topic

2020-04-29 Thread Robin Moffatt
How are you running Kafka? Do you mean when you shut it down you have to
reconfigure the connector?


-- 

Robin Moffatt | Senior Developer Advocate | ro...@confluent.io | @rmoff




JDBC source connector to Kafka topic

2020-04-29 Thread vishnu murali
Hi guys,

I am trying the JDBC source connector to get data from MySQL and send it to
a topic, and what I am facing is that there is a lot of manual work here.

After starting ZooKeeper, the broker, and connect-distributed in Apache Kafka,

I need to send a POST request to localhost:8083/connectors with the config
details in the request body every time I need data, and all the data comes
again and again.

Is there any way to achieve CDC here?
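
Two points may help here. First, with connect-distributed the connector configuration is stored in Kafka itself (the Connect config topic), so a single POST is enough: the connector keeps polling on `poll.interval.ms` and is restored after restarts, and there is no need to re-POST each time. Second, the data coming "again and again" is the behaviour of `bulk` mode; `timestamp+incrementing` mode sends only new and updated rows. A config sketch (the names and columns below are hypothetical); for true log-based CDC from MySQL, the Debezium MySQL connector is the usual alternative:

```json
{
  "name": "mysql-source-incremental",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:mysql://localhost:3306/mydb",
    "connection.user": "user",
    "connection.password": "password",
    "mode": "timestamp+incrementing",
    "incrementing.column.name": "id",
    "timestamp.column.name": "updated_at",
    "table.whitelist": "orders",
    "topic.prefix": "mysql-",
    "poll.interval.ms": "10000"
  }
}
```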