[ 
https://issues.apache.org/jira/browse/FLINK-19025?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17184036#comment-17184036
 ] 

Rui Li commented on FLINK-19025:
--------------------------------

[~McClone] The doc lists the dependencies needed to use the hive connector. If 
you're only using the filesystem connector, you don't need these jars.
But as I mentioned earlier, data written by the filesystem connector may not be 
compatible with Hive. We only guarantee compatibility between Hive and the hive 
connector.
If you need Hive to consume the written data, you should use the hive connector 
to write it.
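
As a minimal sketch of that approach (catalog, database, and table names here are illustrative, and this assumes a HiveCatalog has already been registered with the TableEnvironment or SQL client), writing through the hive connector instead of the filesystem connector looks like:

```sql
-- Assumes a HiveCatalog named "myhive" is already registered;
-- "my_db.orc_sink" is a hypothetical Hive table stored as ORC.
USE CATALOG myhive;

-- Writing via the hive connector produces ORC files in the format
-- Hive itself expects, so Hive 2.1.1 can read them back.
INSERT INTO my_db.orc_sink
SELECT id, name FROM source_table;
```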

> table sql write orc file but hive2.1.1 can not read
> ---------------------------------------------------
>
>                 Key: FLINK-19025
>                 URL: https://issues.apache.org/jira/browse/FLINK-19025
>             Project: Flink
>          Issue Type: Bug
>          Components: Connectors / ORC
>    Affects Versions: 1.11.0
>            Reporter: McClone
>            Priority: Major
>
> Table SQL writes an ORC file, but a Hive 2.1.1 external table created over 
> it cannot read the data, because Flink uses orc-core-1.5.6.jar while Hive 
> 2.1.1 uses its own ORC jar.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
