[ https://issues.apache.org/jira/browse/FLINK-13438?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Caizhi Weng updated FLINK-13438:
--------------------------------
Attachment: 0001-hive.patch
> Fix Hive connector with DataTypes.DATE/TIME/TIMESTAMP support
> -------------------------------------------------------------
>
> Key: FLINK-13438
> URL: https://issues.apache.org/jira/browse/FLINK-13438
> Project: Flink
> Issue Type: Sub-task
> Components: Connectors / Hive
> Reporter: Caizhi Weng
> Priority: Blocker
> Fix For: 1.9.0, 1.10.0
>
> Attachments: 0001-hive.patch
>
>
> Similar to the JDBC connector, the Hive connector communicates with the Flink
> framework through TableSchema, which contains DataType. Since the time data
> read from and written to the Hive connector must use java.sql.* types, while
> the default conversion classes of our time data types are java.time.*, we
> have to fix the Hive connector to support DataTypes.DATE/TIME/TIMESTAMP.
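> A minimal sketch of the bridging described above, assuming the Table API's
> DataTypes factory and DataType#bridgedTo; how this is wired into the Hive
> connector itself is only illustrative:
> {code:java}
> import org.apache.flink.table.api.DataTypes;
> import org.apache.flink.table.types.DataType;
>
> public class HiveTimeTypeBridging {
>     public static void main(String[] args) {
>         // Bridge the time types to the java.sql.* conversion classes that
>         // the Hive connector reads and writes, instead of the default
>         // java.time.* classes (LocalDate / LocalTime / LocalDateTime).
>         DataType hiveDate = DataTypes.DATE().bridgedTo(java.sql.Date.class);
>         DataType hiveTime = DataTypes.TIME().bridgedTo(java.sql.Time.class);
>         // Precision 9 is only an example; use whatever the Hive column declares.
>         DataType hiveTimestamp =
>             DataTypes.TIMESTAMP(9).bridgedTo(java.sql.Timestamp.class);
>
>         System.out.println(hiveDate);
>         System.out.println(hiveTime);
>         System.out.println(hiveTimestamp);
>     }
> }
> {code}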
> But currently, when reading tables from Hive, the table schema is created
> from Hive's schema, so the time types in the created schema will be SQL time
> types, not local time types. If a user specifies a local time type in the
> table schema when creating a table in Hive, they will get a different schema
> when reading it back. This is undesirable.
--
This message was sent by Atlassian JIRA
(v7.6.14#76016)