[ 
https://issues.apache.org/jira/browse/FLINK-8538?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16358295#comment-16358295
 ] 

Timo Walther commented on FLINK-8538:
-------------------------------------

[~xccui] yes, I think this is necessary. We also have to think about how to 
handle JSON-specific types. E.g., the JSON standard declares a "Number" type but 
we have to map it to some Java primitive. It may also declare union types. We 
have the following options:

Option 1: We infer the type using information from the {{TableSchema}} (but 
this would be Table API specific, formats are intended for all APIs).

Option 2: We make this configurable: number as {{Double}}, number as {{BigDecimal}}, etc.

Option 3: We introduce a new TypeInformation.
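To illustrate Option 2, here is a minimal sketch of what a configurable number mapping could look like. The names ({{JsonNumberMode}}, {{parseJsonNumber}}) are hypothetical and not part of any Flink API; the point is only that the same JSON literal yields different Java types depending on configuration:

```java
import java.math.BigDecimal;

// Hypothetical sketch of Option 2: a configurable mapping for the JSON
// "Number" type. JsonNumberMode and parseJsonNumber are illustrative
// names, not existing Flink classes.
public class JsonNumberMapping {

    enum JsonNumberMode { AS_DOUBLE, AS_BIG_DECIMAL }

    static Object parseJsonNumber(String literal, JsonNumberMode mode) {
        switch (mode) {
            case AS_DOUBLE:
                // Fast, but may lose precision for long decimal literals.
                return Double.valueOf(literal);
            case AS_BIG_DECIMAL:
                // Preserves the exact decimal representation.
                return new BigDecimal(literal);
            default:
                throw new IllegalArgumentException("Unknown mode: " + mode);
        }
    }

    public static void main(String[] args) {
        // A literal that cannot be represented exactly as a double.
        String n = "0.1000000000000000055511151231257827";
        System.out.println(parseJsonNumber(n, JsonNumberMode.AS_DOUBLE));
        System.out.println(parseJsonNumber(n, JsonNumberMode.AS_BIG_DECIMAL));
    }
}
```

The trade-off is the usual one: {{Double}} is cheap and fits SQL's DOUBLE, while {{BigDecimal}} preserves precision but is more expensive and needs a DECIMAL type on the table side.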

If we really want to support JSON once and for all, we have to think about how 
to handle those cases. I just read a discussion on the Beam mailing list about this:
https://lists.apache.org/thread.html/ee6843859f1ddb1d4544c32d255fe88a3bf3aec97d3afc3e3d47c701@%3Cdev.beam.apache.org%3E

> Add a Kafka table source factory with JSON format support
> ---------------------------------------------------------
>
>                 Key: FLINK-8538
>                 URL: https://issues.apache.org/jira/browse/FLINK-8538
>             Project: Flink
>          Issue Type: Sub-task
>          Components: Table API & SQL
>            Reporter: Timo Walther
>            Assignee: Xingcan Cui
>            Priority: Major
>
> Similar to CSVTableSourceFactory a Kafka table source factory for JSON should 
> be added. This issue includes improving the existing JSON descriptor with 
> validation that can be used for other connectors as well. It is up for 
> discussion if we want to split the KafkaJsonTableSource into connector and 
> format such that we can reuse the format for other table sources as well.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
