[ 
https://issues.apache.org/jira/browse/FLINK-16689?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17084783#comment-17084783
 ] 

xiaodao commented on FLINK-16689:
---------------------------------

The Table API has the same limitation. For now I have to modify
org.apache.flink.formats.json.JsonNodeDeserializationSchema#deserialize to
decode the bytes with an explicit charset, then call ObjectMapper#readValue with the resulting string;

like:

@Override
public ObjectNode deserialize(byte[] message) throws IOException {
    // Decode the raw bytes with an explicit charset before parsing as JSON.
    String msgStr = new String(message, StandardCharsets.ISO_8859_1);
    return mapper.readValue(msgStr, ObjectNode.class);
}
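The core of the workaround is the byte-to-string step with a caller-chosen charset. A minimal, self-contained sketch of that idea (class and method names here are hypothetical, not Flink API; it only shows the configurable-charset decode that the proposal would add in front of JSON parsing):

```java
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;

// Hypothetical helper: decodes raw record bytes with a charset chosen at
// construction time (e.g. "GBK" or "ISO-8859-1"), instead of assuming UTF-8.
public class CharsetAwareDecoder {
    private final Charset charset;

    public CharsetAwareDecoder(String charsetName) {
        // Charset.forName throws if the JVM does not support the charset.
        this.charset = Charset.forName(charsetName);
    }

    public String decode(byte[] message) {
        // Decode with the configured charset; the resulting String can then
        // be handed to a JSON parser such as ObjectMapper#readValue.
        return new String(message, charset);
    }

    public static void main(String[] args) {
        byte[] bytes = "héllo".getBytes(StandardCharsets.ISO_8859_1);
        CharsetAwareDecoder decoder = new CharsetAwareDecoder("ISO-8859-1");
        System.out.println(decoder.decode(bytes)); // prints "héllo"
    }
}
```

A real deserialization schema would take the charset name as a constructor or builder option, so that consumers of GBK-encoded records do not have to subclass the existing UTF-8-only schema.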

> kafka connector as source get byte deserialize add support charsets
> -------------------------------------------------------------------
>
>                 Key: FLINK-16689
>                 URL: https://issues.apache.org/jira/browse/FLINK-16689
>             Project: Flink
>          Issue Type: Improvement
>          Components: Connectors / Kafka
>    Affects Versions: 1.10.0
>            Reporter: xiaodao
>            Priority: Minor
>
> Sometimes a Kafka producer sends records serialized with a specific
> charset, e.g. GBK,
> but the consumer has no way to deserialize with that charset.
> See, for example:
> org.apache.flink.formats.json.JsonRowDeserializationSchema#deserialize



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
