Check out this article, which covers decimal handling:
https://www.confluent.io/blog/kafka-connect-deep-dive-jdbc-source-connector/#bytes-decimals-numerics
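In short: the connector represents DECIMAL/NUMERIC columns with Connect's Decimal
logical type, which carries the BigDecimal's unscaled bytes, and the JsonConverter
base64-encodes those bytes - that's why you see strings like "Aote" instead of
numbers. As a rough illustration of what is inside one of those strings (the scale
here is just a guess; the real scale is whatever your column declares), something
like this decodes it back:

import java.math.BigDecimal;
import java.math.BigInteger;
import java.util.Base64;

public class DecodeConnectDecimal {
    public static void main(String[] args) {
        // The JsonConverter writes Connect's Decimal logical type (schema type BYTES)
        // as a base64 string holding the BigDecimal's unscaled bytes.
        String base64 = "Aote";   // one of the values from the message below
        int scale = 3;            // assumption: use the scale your DECIMAL column actually declares

        byte[] unscaledBytes = Base64.getDecoder().decode(base64);
        BigDecimal decoded = new BigDecimal(new BigInteger(unscaledBytes), scale);

        System.out.println(decoded);  // the original value, if the assumed scale is right
    }
}

With numeric.mapping=best_fit the connector should map such columns to a numeric
primitive based on their declared precision and scale instead of sending bytes - the
section linked above walks through the options and their caveats.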


-- 

Robin Moffatt | Senior Developer Advocate | ro...@confluent.io | @rmoff


On Thu, 2 Jul 2020 at 13:54, vishnu murali <vishnumurali9...@gmail.com>
wrote:

> Hi Guys,
>
> I am having a problem while reading from MySQL using the JDBC source
> connector: the values I receive look like the ones below.
> Does anyone know what the reason is and how to solve it?
>
> "a": "Aote",
>
>   "b": "AmrU",
>
>   "c": "AceM",
>
>   "d": "Aote",
>
>
> Instead of
>
> "a": 0.002,
>
>   "b": 0.465,
>
>   "c": 0.545,
>
>   "d": 0.100
>
>
> This is my configuration:
>
>
> {
>     "name": "sample",
>     "config": {
>         "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
>         "connection.url": "jdbc:mysql://localhost:3306/sample",
>         "connection.user": "xxxx",
>         "connection.password": "xxx",
>         "topic.prefix": "dample-",
>         "poll.interval.ms": 3600000,
>         "table.whitelist": "sample",
>         "schemas.enable": "false",
>         "mode": "bulk",
>         "value.converter.schemas.enable": "false",
>         "numeric.mapping": "best_fit",
>         "value.converter": "org.apache.kafka.connect.json.JsonConverter",
>         "transforms": "createKey,extractInt",
>         "transforms.createKey.type": "org.apache.kafka.connect.transforms.ValueToKey",
>         "transforms.createKey.fields": "ID",
>         "transforms.extractInt.type": "org.apache.kafka.connect.transforms.ExtractField$Key",
>         "transforms.extractInt.field": "ID"
>     }
> }
>
