[ https://issues.apache.org/jira/browse/FLINK-3524?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15217695#comment-15217695 ]
ASF GitHub Bot commented on FLINK-3524:
---------------------------------------

Github user zentol commented on the pull request:

    https://github.com/apache/flink/pull/1834#issuecomment-203335575

    @StephanEwen yes, it's only for Kafka. It relies on other classes (KeyedDeserializationSchema) that are only present in the Kafka module.

> Provide a JSONDeserialisationSchema in the kafka connector package
> ------------------------------------------------------------------
>
>                 Key: FLINK-3524
>                 URL: https://issues.apache.org/jira/browse/FLINK-3524
>             Project: Flink
>          Issue Type: Improvement
>          Components: Kafka Connector
>            Reporter: Robert Metzger
>            Assignee: Chesnay Schepler
>              Labels: starter
>
> (I don't want to include this in 1.0.0.)
> Currently, there is no standardized way of parsing JSON data from a Kafka
> stream. I see a lot of users using JSON in their topics. It would make things
> easier for our users to provide a serializer for them.
> I suggest using the Jackson library because we already have it as a
> dependency in Flink and it allows parsing from a byte[].
> I would suggest providing the following classes:
> - JSONDeserializationSchema()
> - JSONDeKeyValueSerializationSchema(bool includeMetadata)
> The second variant should produce a record like this:
> {code}
> {"key": "keydata",
>  "value": "valuedata",
>  "metadata": {"offset": 123, "topic": "<topic>", "partition": 2 }}
> {code}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
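The record shape proposed above (key, value, and optional Kafka metadata) can be sketched as follows. This is only an illustrative sketch, not Flink's actual implementation: it stands in for the proposed key/value deserialization schema using plain JDK types (a `Map` instead of a Jackson `ObjectNode`, and no dependency on Flink's `KeyedDeserializationSchema` interface), so the class and method names here are assumptions.

```java
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of the proposed key/value schema. The real proposal
// would use Jackson to parse the byte[] payloads; here the key and value are
// simply decoded as UTF-8 strings to keep the example self-contained.
public class JsonKeyValueSketch {

    /** Wraps a Kafka record's key, value, and (optionally) its metadata. */
    public static Map<String, Object> deserialize(
            byte[] key, byte[] value,
            String topic, int partition, long offset,
            boolean includeMetadata) {
        Map<String, Object> record = new HashMap<>();
        record.put("key", new String(key, StandardCharsets.UTF_8));
        record.put("value", new String(value, StandardCharsets.UTF_8));
        if (includeMetadata) {
            // Mirrors the {"offset": ..., "topic": ..., "partition": ...}
            // metadata object shown in the issue description.
            Map<String, Object> meta = new HashMap<>();
            meta.put("offset", offset);
            meta.put("topic", topic);
            meta.put("partition", partition);
            record.put("metadata", meta);
        }
        return record;
    }
}
```

The `includeMetadata` flag corresponds to the constructor argument in the proposed `JSONDeKeyValueSerializationSchema(bool includeMetadata)`: when it is false, only the key/value pair is emitted.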