Re: Problem in reading From JDBC SOURCE

2020-07-03 Thread vishnu murali
Hi,

Below is the script I used to create the table in MySQL:


CREATE TABLE `sample` (
  `id` varchar(45) NOT NULL,
  `a` decimal(10,3) DEFAULT NULL,
  `b` decimal(10,3) DEFAULT NULL,
  `c` decimal(10,3) DEFAULT NULL,
  `d` decimal(10,3) DEFAULT NULL,
  PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_0900_ai_ci;



Table data:

id - 1
a  - 0.002
b  - 2.250
c  - 0.789
d  - 0.558



Re: Problem in reading From JDBC SOURCE

2020-07-02 Thread Ricardo Ferreira

Vishnu,

I think it is hard to troubleshoot this without the proper context. In
your case, could you please share an example of the rows contained in
the table `sample`, as well as its DDL?


-- Ricardo


Re: Problem in reading From JDBC SOURCE

2020-07-02 Thread vishnu murali
I went through that documentation, where it describes that DECIMAL handling
is not supported for MySQL.

There is also no MySQL example there, so is there any other sample that
covers MySQL?
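One workaround that is sometimes used when `numeric.mapping` does not take effect is to let the connector run a custom query that casts the DECIMAL columns to a floating-point type before they reach Connect. The sketch below is an assumption, not something confirmed in this thread; the connector name, topic prefix, and omitted credentials are placeholders, and note that `query` replaces `table.whitelist` (the two cannot be combined):

```json
{
  "name": "sample-cast",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:mysql://localhost:3306/sample",
    "mode": "bulk",
    "topic.prefix": "sample-cast-",
    "query": "SELECT `id`, CAST(`a` AS DOUBLE) AS `a`, CAST(`b` AS DOUBLE) AS `b`, CAST(`c` AS DOUBLE) AS `c`, CAST(`d` AS DOUBLE) AS `d` FROM `sample`"
  }
}
```

`CAST(... AS DOUBLE)` requires MySQL 8.0.17 or later; on older versions, multiplying the column by `1e0` achieves a similar coercion.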





Re: Problem in reading From JDBC SOURCE

2020-07-02 Thread Robin Moffatt
Check out this article, which covers decimal handling:
https://www.confluent.io/blog/kafka-connect-deep-dive-jdbc-source-connector/#bytes-decimals-numerics


-- 

Robin Moffatt | Senior Developer Advocate | ro...@confluent.io | @rmoff



Problem in reading From JDBC SOURCE

2020-07-02 Thread vishnu murali
Hi Guys,

I am having a problem reading from MySQL using the JDBC source connector:
I receive values like the ones below. Does anyone know the reason, and how
to solve this?

"a": "Aote",
"b": "AmrU",
"c": "AceM",
"d": "Aote"

instead of

"a": 0.002,
"b": 0.465,
"c": 0.545,
"d": 0.100


This is my configuration:

{
  "name": "sample",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:mysql://localhost:3306/sample",
    "connection.user": "",
    "connection.password": "xxx",
    "topic.prefix": "dample-",
    "poll.interval.ms": 360,
    "table.whitelist": "sample",
    "schemas.enable": "false",
    "mode": "bulk",
    "value.converter.schemas.enable": "false",
    "numeric.mapping": "best_fit",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "transforms": "createKey,extractInt",
    "transforms.createKey.type": "org.apache.kafka.connect.transforms.ValueToKey",
    "transforms.createKey.fields": "ID",
    "transforms.extractInt.type": "org.apache.kafka.connect.transforms.ExtractField$Key",
    "transforms.extractInt.field": "ID"
  }
}
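For reference, opaque strings like the ones above are what Kafka Connect's Decimal logical type looks like after JSON serialization: the DECIMAL's unscaled value is stored as big-endian two's-complement bytes, which JsonConverter then base64-encodes. A minimal Python sketch of the decoding follows; the example value 2.250 (and its encoding "CMo=") is constructed for illustration and is not taken from the thread's actual payloads:

```python
import base64
from decimal import Decimal

def decode_connect_decimal(b64: str, scale: int) -> Decimal:
    """Decode a Connect Decimal: base64 -> big-endian two's-complement
    unscaled integer -> Decimal shifted right by `scale` digits."""
    raw = base64.b64decode(b64)
    unscaled = int.from_bytes(raw, byteorder="big", signed=True)
    return Decimal(unscaled).scaleb(-scale)

# A DECIMAL(10,3) value of 2.250 has unscaled value 2250
# -> bytes 0x08 0xCA -> base64 "CMo="
print(decode_connect_decimal("CMo=", 3))  # -> 2.250
```

This explains why the payload shows short base64 strings instead of numbers: the converter is faithfully serializing bytes, not garbling the data.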