Hi, I'll find a Hive 2.1.1 environment and try to reproduce this issue. But first, please note that Flink currently does not support Hive ACID tables. Even if the data in your example is written successfully, it will not satisfy ACID semantics, and Hive may not be able to read it on its side.
On Thu, Dec 3, 2020 at 5:23 PM yang xu <[email protected]> wrote:

> Hi Rui Li
> The jars under lib are as follows:
> flink-csv-1.11.2.jar
> flink-dist_2.11-1.11.2.jar
> flink-json-1.11.2.jar
> flink-shaded-hadoop-3-uber-3.1.1.7.1.1.0-565-9.0.jar
> flink-shaded-zookeeper-3.4.14.jar
> flink-sql-connector-hive-2.2.0_2.11-1.11.2.jar
> flink-table_2.11-1.11.2.jar
> flink-table-api-java-bridge_2.11-1.11.2.jar
> flink-table-blink_2.11-1.11.2.jar
> flink-table-planner-blink_2.11-1.11.2.jar
> log4j-1.2-api-2.12.1.jar
> log4j-api-2.12.1.jar
> log4j-core-2.12.1.jar
> log4j-slf4j-impl-2.12.1.jar
>
> The statement writing to Hive is a simple insert:
> insert into hive_t1 SELECT address FROM users
>
> The CREATE TABLE statement is as follows:
> create table hive_t1(address string)
> clustered by (address) into 8 buckets
> stored as orc TBLPROPERTIES ('transactional'='true','orc.compress' =
> 'SNAPPY');
>
> Thanks a lot for your help!
>
>
>
>
> --
> Sent from: http://apache-flink.147419.n8.nabble.com/

--
Best regards!
Rui Li
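Since the problem is the `'transactional'='true'` property, a minimal sketch of a non-transactional alternative that Flink's Hive connector can write would look like the DDL below. This is an assumption based on the table definition quoted above, not a verified fix for your environment; dropping the transactional property also drops ACID guarantees, and bucketing is kept only for illustration:

```
-- Sketch: same table without the ACID/transactional property,
-- so Flink's INSERT INTO can write it and Hive can read it back.
create table hive_t1_nonacid(address string)
clustered by (address) into 8 buckets
stored as orc TBLPROPERTIES ('orc.compress' = 'SNAPPY');

-- Same insert as before, targeting the non-transactional table:
insert into hive_t1_nonacid SELECT address FROM users;
```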
