[
https://issues.apache.org/jira/browse/FLINK-9964?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16574738#comment-16574738
]
Timo Walther commented on FLINK-9964:
-------------------------------------
I think we should not mix up formats. If we read a string from the CSV format
and that string contains JSON, we can offer JSON utility functions to deal with
such data. This is also being discussed in Calcite right now; see CALCITE-2266.
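To make the separation concrete, here is a minimal sketch of the idea: the CSV format only yields plain strings, and a separate JSON utility function (in the spirit of SQL's JSON_VALUE, as discussed in CALCITE-2266) is applied afterwards. The helper name `json_value` and the dotted-path syntax are illustrative assumptions, not an existing Flink or Calcite API.

```python
import csv
import io
import json

def json_value(json_string, path):
    """Extract a scalar from a JSON string via a simple dotted path.

    Illustrative stand-in for a SQL JSON_VALUE-style utility function;
    the path syntax here is a simplification of a real JSON path.
    """
    value = json.loads(json_string)
    for key in path.split("."):
        value = value[key]
    return value

# The CSV layer parses the row and hands back strings; JSON handling
# is a separate, composable step on top of it.
row = next(csv.reader(io.StringIO('42,"{""user"": {""name"": ""alice""}}"')))
print(json_value(row[1], "user.name"))  # -> alice
```

The point is that neither layer needs to know about the other: the CSV format stays format-agnostic, and the JSON function works on any string, wherever it came from.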
> Add a CSV table format factory
> ------------------------------
>
> Key: FLINK-9964
> URL: https://issues.apache.org/jira/browse/FLINK-9964
> Project: Flink
> Issue Type: Sub-task
> Components: Table API & SQL
> Reporter: Timo Walther
> Assignee: buptljy
> Priority: Major
>
> We should add a RFC 4180 compliant CSV table format factory to read and write
> data into Kafka and other connectors. This requires a
> {{SerializationSchemaFactory}} and {{DeserializationSchemaFactory}}. How we
> want to represent all data types and nested types is still up for discussion.
> For example, we could flatten and deflatten nested types as it is done
> [here|http://support.gnip.com/articles/json2csv.html]. We can also have a
> look at how tools such as the Avro to CSV tool perform the conversion.
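The flatten/deflatten approach mentioned in the description can be sketched as follows, using dotted column names in the style of the referenced json2csv conversion. The function names `flatten`/`deflatten` and the dot separator are assumptions for illustration, not part of any Flink API.

```python
def flatten(record, prefix=""):
    """Turn nested dicts into a flat {dotted.key: value} mapping,
    suitable for writing as one CSV column per leaf field."""
    flat = {}
    for key, value in record.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, name + "."))
        else:
            flat[name] = value
    return flat

def deflatten(flat):
    """Inverse of flatten: rebuild nested structure from dotted names."""
    record = {}
    for name, value in flat.items():
        parts = name.split(".")
        node = record
        for part in parts[:-1]:
            node = node.setdefault(part, {})
        node[parts[-1]] = value
    return record

nested = {"id": 1, "user": {"name": "alice", "address": {"city": "Berlin"}}}
flat = flatten(nested)
# flat == {"id": 1, "user.name": "alice", "user.address.city": "Berlin"}
assert deflatten(flat) == nested
```

One open question this sketch surfaces is the choice of separator: a dot in a field name would collide with the dotted-path encoding, so a real implementation would need escaping rules or a configurable delimiter.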
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)