Hi guys,
I am trying the JDBC source connector to get data from MySQL and send it to a topic. What I am facing is a lot of manual work here:
after starting ZooKeeper, the Kafka server, and connect-distributed in Apache Kafka,
I need to send a POST request every time to localhost:8083/connectors
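For reference, a request body for the JDBC source connector usually looks something like this (the connector name, connection URL, and credentials below are placeholders, not values from this thread):

```json
{
  "name": "mysql-jdbc-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:mysql://localhost:3306/mydb",
    "connection.user": "user",
    "connection.password": "password",
    "mode": "incrementing",
    "incrementing.column.name": "id",
    "topic.prefix": "mysql-"
  }
}
```

Note that connect-distributed stores connector configs in its internal config topic, so a connector created once should survive worker restarts without re-POSTing.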
luence/pages/viewpage.action?pageId=27846330
>
> This can do what you want. Note that the offsets are not copied, nor are
> the message timestamps.
>
> HTH
>
>
> On Wed, Apr 29, 2020 at 6:47 PM vishnu murali
> wrote:
>
> > Hi Guys,
> >
> > I am having two s
Hi Guys,
I am having two separate Kafka clusters running on two independent ZooKeepers.
I need to send a set of data from one topic in cluster A to cluster B,
with the same topic name and all the data.
How can I achieve this?
Does anyone have any idea?
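One option for this is MirrorMaker 2; a minimal mm2.properties sketch, with placeholder bootstrap addresses:

```
clusters = A, B
A.bootstrap.servers = localhost:9092
B.bootstrap.servers = localhost:9091
A->B.enabled = true
A->B.topics = .*
```

Run with bin/connect-mirror-maker.sh mm2.properties. Note that by default MM2 prefixes replicated topic names with the source cluster alias (e.g. A.mytopic), so keeping the exact same topic name needs extra configuration.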
Hey Guys,
I am trying to move data between one cluster and another.

                 Source    Destination
*Zookeeper*      2181      2182
*Kafka*          9092      9091

*Consumer Properties:*
bootstrap.servers=localhost:9092
group.id=test-consumer-group
auto.offset.reset=earliest

*Producer Properties:*
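With consumer and producer property files like these, classic MirrorMaker is typically launched along these lines (file names and topic are placeholders):

```
bin/kafka-mirror-maker.sh \
  --consumer.config consumer.properties \
  --producer.config producer.properties \
  --whitelist "my-topic"
```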
are looking for.
>
>
> --
>
> Robin Moffatt | Senior Developer Advocate | ro...@confluent.io | @rmoff
>
>
> On Fri, 1 May 2020 at 13:57, vishnu murali
> wrote:
>
> > Hi Guys
> >
> > Previously I asked question about the Mirror maker and it is solved now.
>
Hi Guys,
Previously I asked a question about MirrorMaker, and it is solved now.
So now I need to know whether any connectors are available for the same purpose.
Like the JdbcConnector acts as a source and sink for DB connections, is there any
connector available for performing mirror operations, or does
> https://cwiki.apache.org/confluence/display/KAFKA/KIP-382%3A+MirrorMaker+2.0#KIP-382:MirrorMaker2.0-Walkthrough:RunningMirrorMaker2.0
>
> In the same vein, any questions, hit me up,
>
> Liam Clarke-Hutchinson
>
> On Sat, May 2, 2020 at 9:56 PM vishnu murali
> wrote:
>
> >
Hey Guys,
Here I am posting the stack trace that occurred in connect-distributed while
giving the mirror connector configuration:
*Post*: http://localhost:8083/connectors
*Request JSON body:*
{
"name": "us-west-sourc",
"config": {
"connector.class":
Hey Guys,
I am using Apache Kafka version 2.5.
Correct me if I am wrong!
There is a jar file called connect-mirror-2.5.0 in the libs folder. I
think it is a connector to copy topic data between one cluster and
another, like MirrorMaker.
So I started ZooKeeper.
I started the Kafka server.
Hi guys,
I have a Kafka topic, Films, with 2000 records.
In a Spring Boot KafkaListener I am listening to that particular topic,
but I need to process each record one at a time, and only after that
consume the next record.
How can I handle this scenario?
Any ideas?
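One common approach is to cap how many records each poll returns; a consumer config sketch, assuming default listener behavior otherwise:

```
max.poll.records=1
enable.auto.commit=false
```

In Spring Boot, the same limit can be set as spring.kafka.consumer.max-poll-records=1 in application.properties.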
the data from
DB.
On Thu, Apr 30, 2020, 02:04 Robin Moffatt wrote:
> How are you running Kafka? Do you mean when you shut it down you have to
> reconfigure the connector?
>
>
> --
>
> Robin Moffatt | Senior Developer Advocate | ro...@confluent.io | @rmoff
>
>
> On Wed
er to make this switch in
advance of that release add the following to the corresponding config:
'partition.assignment.strategy=org.apache.kafka.clients.consumer.RoundRobinAssignor'
Can anyone clarify why I am getting this and what I am doing wrong?
On Thu, Apr 30, 2020 at 12:22 AM vishnu mur
Hi Himanshu,
Can you please tell me how to use MM2?
I am using Apache Kafka; only the normal MirrorMaker is available in it.
Most people say to use MM2, but I wasn't able to find out where
to get MM2.
Is it related to Apache or from some other distributor?
Can you please explain how to
> 1. https://docs.confluent.io/current/connect/devguide.html
> 2.
>
> https://www.confluent.io/blog/create-dynamic-kafka-connect-source-connectors/
>
> Cheers,
>
> Tom
>
>
> On Tue, May 12, 2020 at 7:34 AM vishnu murali
> wrote:
>
> > Hi Guys,
> >
Hi Guys,
I am trying to create a new connector for my own purpose.
Is there any guide or document which shows how to create your own connector and
use it?
Hi friends,
I have a REST endpoint, and data is arriving at that endpoint
continuously.
I need to send that data to a Kafka topic.
I would like to solve the above scenario using a connector,
because I don't want to run another application just to receive data from REST
and send it to Kafka.
Hey Guys,
I am working on the JDBC Sink Connector to take data from a Kafka topic to MySQL.
I have 2 questions.
I am using plain Apache Kafka 2.5, not the Confluent version.
1) For inserting data, every time we need to add the schema along with
every record. How can I overcome this situation?
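For context, when Connect's JsonConverter is used with schemas enabled, each message must carry a schema/payload envelope like this (field names here are illustrative):

```json
{
  "schema": {
    "type": "struct",
    "fields": [
      {"field": "id", "type": "int32"},
      {"field": "name", "type": "string"}
    ],
    "optional": false,
    "name": "record"
  },
  "payload": {"id": 1, "name": "example"}
}
```

Avoiding this per-message overhead is typically done by switching to a schema registry-backed converter (e.g. Avro), which is a Confluent component rather than part of plain Apache Kafka.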
cess 1 record at a time.
>
> See https://kafka.apache.org/documentation/#consumerconfigs .
>
> On Mon, May 4, 2020 at 1:04 AM vishnu murali
> wrote:
>
> > Hey Guys,
> >
> > I am having a topic and in that topic I am having 3000 messages
> >
> > In my s
m.
> >
> > See https://docs.confluent.io/current/connect/concepts.html#converters
> >
> > Chris
> >
> >
> >
> > On Fri, May 8, 2020 at 6:59 AM vishnu murali >
> > wrote:
> >
> > > Hey Guys,
> > >
> > > I am *u
Hey Guys,
I am *using Apache **2.5*, not Confluent.
I am trying to send data from a topic to a database using the JDBC sink connector.
We need to send that data with the appropriate schema as well.
I am *not using the Confluent version* of Kafka.
So can anyone explain how I can do this?
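A sketch of the relevant converter settings for plain Apache Kafka with embedded JSON schemas (assuming no schema registry is available), set in the Connect worker properties:

```
key.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=true
```

With schemas.enable=true, each message value must carry the schema/payload envelope itself.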
| Senior Developer Advocate | ro...@confluent.io | @rmoff
>
>
> On Thu, 7 May 2020 at 06:48, vishnu murali
> wrote:
>
> > Hey Guys,
> >
> > i am working on JDBC Sink Conneector to take data from kafka topic to
> > mysql.
> >
> > i am having 2 ques
Hey Guys,
Now I am trying to implement the SFTP connector using these configurations.
I am using a Windows system, so I have doubts about how to set a path.
I tried setting it like this in the *config*, as */mnt/c/users/vmuralidharan*,
but it doesn't work.
So what do I need to do?
{
Hi Guys,
While trying the SFTP CSV source I am getting this exception with the
configuration below.
What is the issue and how do I resolve it?
Does anyone know?
*Config:*
{
"name": "CsvSFTP1",
"config": {
"tasks.max": "1",
"connector.class":
Hi Guys,
I am using the mode *bulk* and poll.interval.ms *10* in the source
connector configuration.
But I don't want to load the data another time;
I need to load the data only once.
How can I do this?
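If the goal is to avoid re-loading the whole table on every poll, the usual alternative to bulk mode is incremental loading; a config sketch (the column name is a placeholder):

```json
{
  "mode": "incrementing",
  "incrementing.column.name": "id"
}
```

With bulk mode, the connector re-reads the entire table every poll.interval.ms by design, so only new rows are emitted once you switch to an incremental mode.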
Hi, I am running the cp-all-in-one Docker setup for Confluent Kafka.
There I am trying the JDBCSourceConnector.
It is showing this result:
{
"error_code": 400,
"message":
"Connector configuration is invalid and contains the following 2
error(s):\nInvalid value java.sql.SQLException: No
Liam Clarke-Hutchinson <
liam.cla...@adscale.co.nz> wrote:
> Why not use autoincrement? It'll only emit new records on subsequent polls
> then.
>
> On Thu, 14 May 2020, 11:15 pm vishnu murali,
> wrote:
>
> > Hi Guys,
> >
> > I am using the mode *bulk *a
Hi Guys,
I have a question about this transforms configuration:
"transforms":"createKey,extractInt",
"transforms.createKey.type":"org.apache.kafka.connect.transforms.ValueToKey",
"transforms.createKey.fields":"id",
"transforms.extractInt.type":"org.apache.kafka.connect.transforms.ExtractField$Key",
> thanks, Robin.
>
> On Tue, 19 May 2020 at 16:44, vishnu murali
> wrote:
>
> > Hi Guys
> >
> > By Trying SFTP CSV SOURCE i am getting this exception by using this
> > configuration.
> >
> >
> > what is the issue and how to resolve it?
>
Hey guys,
I changed the properties in the SFTP CSV source and it is working fine now.
Then I set schema generation enabled to true, so it is adding the schema
to every record in the topic.
When I set that generation to false, it asks for key.schema and value.schema.
But both will be JSON and I
Hey Guys,
One of the values for the SFTP configuration is key.schema.
I am sending it through Postman as a JSON request.
So how can I give the schema details, since the schema itself has double quotes
for all keys and values?
Could anyone explain?
If I give it like this, this exception comes:
"key.schema":
Hi guys,
I am using the JDBC source connector to take data from AWS Redshift to Kafka.
There I have a field with datatype Date.
Values after 1970 work fine,
but if the value is before 1970 it produces 00:00:00Z.
May I know how to solve this problem?
Does
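For background, dates are commonly transported as days since the Unix epoch (1970-01-01), so pre-1970 dates come out as negative numbers, which some converters or consumers mishandle as zero. A quick check in plain Python (no Kafka involved):

```python
from datetime import date

EPOCH = date(1970, 1, 1)

def days_since_epoch(d: date) -> int:
    """Days between d and the Unix epoch; negative for pre-1970 dates."""
    return (d - EPOCH).days

print(days_since_epoch(date(1975, 6, 1)))  # 1977
print(days_since_epoch(date(1960, 1, 1)))  # -3653
```

If the consuming side clamps negatives to zero, every pre-1970 date collapses to the epoch, which matches the 00:00:00Z symptom above.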
Hi everyone,
I am getting data from the SFTP CSV source connector.
I need to add a header to each message for consuming it on the Java side.
Does anyone know how to add a header to the message while using a connector?
Thanks in advance.
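Newer Kafka versions (3.0+) ship header-manipulation SMTs; a sketch assuming a Kafka version that includes InsertHeader (this transform is not in 2.5, so on older versions a custom SMT or consumer-side handling would be needed; header name and value here are placeholders):

```json
{
  "transforms": "addHeader",
  "transforms.addHeader.type": "org.apache.kafka.connect.transforms.InsertHeader",
  "transforms.addHeader.header": "source",
  "transforms.addHeader.value.literal": "sftp-csv"
}
```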
> https://www.confluent.io/blog/kafka-connect-deep-dive-jdbc-source-connector/#bytes-decimals-numerics
>
>
> --
>
> Robin Moffatt | Senior Developer Advocate | ro...@confluent.io | @rmoff
>
>
> On Thu, 2 Jul 2020 at 13:54, vishnu murali
> wrote:
>
> > Hi Guys,
Hi Guys,
I am having a problem while reading from MySQL using the JDBC source
connector; I receive values like the ones below.
Does anyone know what the reason is and how to solve it?
"a": "Aote",
"b": "AmrU",
"c": "AceM",
"d": "Aote",
Instead of
"a": 0.002,
"b": 0.465,
"c": 0.545,
"d": 0.100
It's my
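These symptoms match DECIMAL/NUMERIC columns being serialized as base64-encoded bytes (the Decimal logical type). One commonly suggested setting for the Confluent JDBC source connector, as a sketch:

```json
{
  "numeric.mapping": "best_fit"
}
```

best_fit maps NUMERIC/DECIMAL columns to the closest int/float type where possible, instead of the byte-encoded Decimal logical type that shows up as garbled strings in JSON.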
Hi Guys,
First I create a topic and set a schema, and then I try to
take data from MySQL using the JDBC source connector.
At that point, how can I validate that the data from MySQL matches the schema I
set in the schema registry?
Does anyone have any idea about this?
Hi all,
I am using the SFTP connector, where the SFTP connection is accessed
using a public key file.
How can I give this configuration in Postman to start the SFTP connector?
Does anyone have any suggestions?
..
There will be more escape sequences in the public key.
In this situation, do you know how we can use this?
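On embedding a multi-line key in a JSON body: each line break must be encoded as \n. A small illustration in plain Python (the key content is a placeholder; the property name is the one mentioned in the reply below):

```python
import json

# Placeholder PEM content; a real key would have many more lines.
pem = "-----BEGIN PUBLIC KEY-----\nMIIB...\n-----END PUBLIC KEY-----"

body = json.dumps({"config": {"tls.public.key": pem}})
print(body)
```

json.dumps turns the real newlines into the two-character sequence \n, so the resulting string can be pasted into Postman as-is.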
On Wed, Jul 15, 2020, 02:06 Ricardo Ferreira wrote:
> Vishnu,
>
> A public key file can be specified via the property `tls.public.key`.
> Thanks,
>
> -- Ricardo
> On 7/14/20 6
Hi all,
I have questions about the namespace in the schema registry.
If the schema is automatically generated from the JDBC source connector, then
it doesn't have a namespace field and value.
But if we create the schema manually with a namespace, register it in the schema
registry, and then try to run
e an example of the rows contained in
> the table `sample`? As well as its DDL?
>
> -- Ricardo
>
> On 7/2/20 9:29 AM, vishnu murali wrote:
> > I go through that documentation
> >
> > Where it described like DECIMAL is not supported in MySQL like this .
> >
> > A
Hi all,
My schema is:
{
  "fields": [
    {
      "name": "ID",
      "type": "long"
    },
    {
      "name": "Group",
      "type": [
        "null",
        {
          "avro.java.string": "String",
          "type": "string"
        }
      ]
    },
    {
      "name": "Key",
at
io.confluent.kafka.schemaregistry.validator.LruSchemaRegistryClient.getSchemaByIdFromRegistry(LruSchemaRegistryClient.java:188)
On Thu, Jul 16, 2020, 09:31 vishnu murali
wrote:
> Hi all
>
> My schema is:
>
>
>
> {
> "fields": [
>
> {
>
*Bastion Server and cURL*: you can use a bastion server to SSH
> from your machine and then have access to the machine that hosts your Kafka
> Connect server. While in there; you can use cURL to execute the POST
> command along with the `--data-urlencode` parameter.
>
> Thanks,
>
> -- Rica
Hi all,
I am trying to send data from a Kafka Java producer in Avro format.
While trying to send the data, it is not sent; the statements before and after
the send execute correctly, but the send itself is not working.
It does register the schema successfully.
No logs or error message is
Hi everyone,
We are using the Kafka JDBC connector to connect to a Redshift database.
We created three connectors to pull data from three different tables into
three different topics.
We observed that each connector holds multiple active sessions;
even after we delete the connectors, the active sessions are