Hi!

Currently the Flink SQL Hive dialect does not appear to support dropping with a partial partition spec. You could file a JIRA describing this requirement.
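To illustrate the difference (a sketch based on the partition keys [dt, xtlx, sblx] shown in the error below; the values 'a' and 'b' are hypothetical placeholders):

```sql
-- Hive accepts a partial partition spec and drops every matching
-- (xtlx, sblx) combination under dt='2021-08-31':
ALTER TABLE test_partition DROP PARTITION (dt = '2021-08-31');

-- The Flink SQL Hive dialect currently requires all partition keys,
-- so only a fully specified partition can be dropped:
ALTER TABLE test_partition DROP PARTITION (dt = '2021-08-31', xtlx = 'a', sblx = 'b');
```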

Asahi Lee <[email protected]> wrote on Tue, Aug 31, 2021 at 2:03 PM:

> hi!
> I'm using Flink 1.13.1. Dropping the dt partition with the Hive dialect fails, while the same SQL succeeds in Hive!
> Caused by:
> org.apache.flink.table.catalog.exceptions.PartitionSpecInvalidException:
> PartitionSpec CatalogPartitionSpec{{dt=2021-08-31}} does not match
> partition keys [dt, xtlx, sblx] of table test_flink.test_partition in
> catalog check_rule_base_hive_catalog.
>         at
> org.apache.flink.table.catalog.hive.HiveCatalog.getOrderedFullPartitionValues(HiveCatalog.java:1189)
> ~[flink-sql-connector-hive-2.3.6_2.11-1.13.1.jar:1.13.1]
>         at
> org.apache.flink.table.catalog.hive.HiveCatalog.dropPartition(HiveCatalog.java:899)
> ~[flink-sql-connector-hive-2.3.6_2.11-1.13.1.jar:1.13.1]
>         at
> org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:982)
> ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
>         at
> org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:730)
> ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
