Hi Aissa,
You can do this easily with Flink SQL: define a Kafka table using Flink DDL
and the built-in JSON format:
CREATE TABLE sensor_logs (
  `date` STRING,
  `main` ROW<
    `ph` DOUBLE,
    `whc` DOUBLE,
    `temperature` DOUBLE,
    `humidity` DOUBLE>,
  `id` BIGINT,
  `coord` ROW<
    `lon` DOUBLE,
    `lat` DOUBLE>
) WITH (
  'connector' = 'kafka',
  'topic' = '...',
  'properties.bootstrap.servers' = '...',
  'format' = 'json'
);

(Your sample message is cut off before the `coord` field, so I've guessed
`lon`/`lat` for it; adjust those fields and the connector options to match
your actual payload and setup.)
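Before wiring up Flink, it can help to sanity-check that the schema matches the payload. Here is a minimal Python sketch (independent of Flink) that parses your sample message; the nested accesses mirror the ROW<...> types in the DDL above. The `coord` field is omitted since its contents were not shown in your mail.

```python
import json

# Sample message from the original mail, completed with a closing brace.
raw = ('{"date": "2018-05-31 15:10", '
       '"main": {"ph": 5.0, "whc": 60.0, "temperature": 9.5, "humidity": 96}, '
       '"id": 2582}')

record = json.loads(raw)

# `main` maps to a ROW of DOUBLEs, `id` to a BIGINT in the DDL.
print(record["date"])                 # 2018-05-31 15:10
print(record["main"]["temperature"])  # 9.5
print(record["id"])                   # 2582
```

If these lookups fail on your real messages, the DDL column names or types need adjusting before Flink's JSON format will deserialize them cleanly.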
Hello,
Please, can you share with me some demos or examples of deserialization
with Flink?
I need to consume Kafka messages produced by sensors in JSON format.
Here is my JSON message:
{"date": "2018-05-31 15:10", "main": {"ph": 5.0, "whc": 60.0,
"temperature": 9.5, "humidity": 96}, "id": 2582