[ 
https://issues.apache.org/jira/browse/FLINK-13699?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jark Wu resolved FLINK-13699.
-----------------------------
    Fix Version/s: 1.9.1
       Resolution: Fixed

[FLINK-13699][table-api] Fix TableFactory doesn't work with DDL when containing 
TIMESTAMP/DATE/TIME types
 - master: b837a589f1bda5d8352e9760af39937f9194c670
 - 1.9.1: dca0879d8e992b92cff22b27fb15f598dd2c36d9

[FLINK-13699][hbase] Add integration test for HBase to verify DDL with 
TIMESTAMP types
 - master: d20175ee62cd9b3ce8912745240b57c88c5af51c
 - 1.9.1: 95ba5408833fa38aba5624be1fa88fee342cdd1b

> Fix TableFactory doesn't work with DDL when containing TIMESTAMP/DATE/TIME 
> types
> --------------------------------------------------------------------------------
>
>                 Key: FLINK-13699
>                 URL: https://issues.apache.org/jira/browse/FLINK-13699
>             Project: Flink
>          Issue Type: Bug
>          Components: Table SQL / API, Table SQL / Planner
>    Affects Versions: 1.9.0
>            Reporter: Jark Wu
>            Assignee: Jark Wu
>            Priority: Critical
>              Labels: pull-request-available
>             Fix For: 1.9.1
>
>          Time Spent: 10m
>  Remaining Estimate: 0h
>
> Currently, the blink planner converts DDL to {{TableSchema}} using the new 
> type system, i.e. DataTypes.TIMESTAMP()/DATE()/TIME(), whose underlying 
> TypeInformation is Types.LOCAL_DATETIME/LOCAL_DATE/LOCAL_TIME. 
> However, this breaks the existing connector implementations (Kafka, ES, CSV, 
> etc.) because they only accept the old TypeInformation 
> (Types.SQL_TIMESTAMP/SQL_DATE/SQL_TIME).
> A simple solution is to encode DataTypes.TIMESTAMP() as "TIMESTAMP" when 
> translating to properties; it will then be converted back to the old 
> TypeInformation (Types.SQL_TIMESTAMP). This fixes all factories at once.
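The round-trip described above can be sketched roughly as follows. This is an illustrative, self-contained Java sketch of the idea only (encode the new types to plain SQL type names in the properties, then map those names back to the legacy TypeInformation that factories expect); the class and method names are hypothetical and are not Flink's actual API.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: DDL translation writes "TIMESTAMP"/"DATE"/"TIME" into
// the connector properties, and factories resolve those strings to the old
// TypeInformation names instead of the new LOCAL_* ones.
public class TypeRoundTrip {

    // New-type declaration -> property string emitted during DDL translation.
    static String encode(String newDataType) {
        if (newDataType.startsWith("TIMESTAMP")) return "TIMESTAMP";
        if (newDataType.startsWith("DATE"))      return "DATE";
        if (newDataType.startsWith("TIME"))      return "TIME";
        return newDataType;
    }

    // Property string -> legacy TypeInformation name that existing
    // factories (Kafka, ES, CSV, ...) already understand.
    static String decode(String property) {
        Map<String, String> legacy = new HashMap<>();
        legacy.put("TIMESTAMP", "Types.SQL_TIMESTAMP"); // java.sql.Timestamp
        legacy.put("DATE",      "Types.SQL_DATE");      // java.sql.Date
        legacy.put("TIME",      "Types.SQL_TIME");      // java.sql.Time
        return legacy.getOrDefault(property, property);
    }

    public static void main(String[] args) {
        // A DataTypes.TIMESTAMP(3) column round-trips to the old type.
        System.out.println(decode(encode("TIMESTAMP(3)")));
    }
}
```

Because the conversion happens once at the property-serialization boundary, every TableFactory benefits without per-connector changes.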



--
This message was sent by Atlassian Jira
(v8.3.2#803003)
