This is an automated email from the ASF dual-hosted git repository.

lirui pushed a commit to branch release-1.14
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.14 by this push:
     new d3e5553  [FLINK-23897][hive][doc] Fix obsolete doc about creating hive table with flink dialect
d3e5553 is described below

commit d3e55535417215fa47f8c1e283d88e87bc1fa508
Author: Rui Li <[email protected]>
AuthorDate: Fri Aug 20 17:47:32 2021 +0800

    [FLINK-23897][hive][doc] Fix obsolete doc about creating hive table with flink dialect
    
    This closes #16907
---
 .../docs/connectors/table/hive/hive_catalog.md     | 50 ++--------------------
 .../docs/connectors/table/hive/overview.md         |  2 +-
 .../docs/connectors/table/hive/hive_catalog.md     | 50 ++--------------------
 3 files changed, 9 insertions(+), 93 deletions(-)

diff --git a/docs/content.zh/docs/connectors/table/hive/hive_catalog.md b/docs/content.zh/docs/connectors/table/hive/hive_catalog.md
index e1be3da..2846990 100644
--- a/docs/content.zh/docs/connectors/table/hive/hive_catalog.md
+++ b/docs/content.zh/docs/connectors/table/hive/hive_catalog.md
@@ -64,12 +64,9 @@ Generic tables, on the other hand, are specific to Flink. When creating generic
 HMS to persist the metadata. While these tables are visible to Hive, it's unlikely Hive is able to understand
 the metadata. And therefore using such tables in Hive leads to undefined behavior.
 
-Flink uses the property '*is_generic*' to tell whether a table is Hive-compatible or generic. When creating a table with
-`HiveCatalog`, it's by default considered generic. If you'd like to create a Hive-compatible table, make sure to set
-`is_generic` to false in your table properties.
-
-As stated above, generic tables shouldn't be used from Hive. In Hive CLI, you can call `DESCRIBE FORMATTED` for a table and
-decide whether it's generic or not by checking the `is_generic` property. Generic tables will have `is_generic=true`.
+It's recommended to switch to [Hive dialect]({{< ref "docs/connectors/table/hive/hive_dialect" >}}) to create Hive-compatible tables.
+If you want to create Hive-compatible tables with default dialect, make sure to set `'connector'='hive'` in your table properties, otherwise
+a table is considered generic by default in `HiveCatalog`. Note that the `connector` property is not required if you use Hive dialect.
 
 ### Example
 
@@ -206,7 +203,7 @@ root
 
 ```
 
-Verify the table is also visible to Hive via Hive Cli, and note that the table has property `is_generic=true`:
+Verify the table is also visible to Hive via Hive Cli:
 
 ```bash
 hive> show tables;
@@ -214,45 +211,6 @@ OK
 mykafka
 Time taken: 0.038 seconds, Fetched: 1 row(s)
 
-hive> describe formatted mykafka;
-OK
-# col_name             data_type               comment
-
-
-# Detailed Table Information
-Database:              default
-Owner:                 null
-CreateTime:            ......
-LastAccessTime:        UNKNOWN
-Retention:             0
-Location:              ......
-Table Type:            MANAGED_TABLE
-Table Parameters:
-       flink.connector.properties.bootstrap.servers    localhost:9092
-       flink.connector.topic   test
-       flink.connector.type    kafka
-       flink.connector.version universal
-       flink.format.type       csv
-       flink.generic.table.schema.0.data-type  VARCHAR(2147483647)
-       flink.generic.table.schema.0.name       name
-       flink.generic.table.schema.1.data-type  INT
-       flink.generic.table.schema.1.name       age
-       flink.update-mode       append
-       is_generic              true
-       transient_lastDdlTime   ......
-
-# Storage Information
-SerDe Library:         org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
-InputFormat:           org.apache.hadoop.mapred.TextInputFormat
-OutputFormat:          org.apache.hadoop.hive.ql.io.IgnoreKeyTextOutputFormat
-Compressed:            No
-Num Buckets:           -1
-Bucket Columns:        []
-Sort Columns:          []
-Storage Desc Params:
-       serialization.format    1
-Time taken: 0.158 seconds, Fetched: 36 row(s)
-
 ```
 
 
diff --git a/docs/content.zh/docs/connectors/table/hive/overview.md b/docs/content.zh/docs/connectors/table/hive/overview.md
index d1af372..f3b02c1 100644
--- a/docs/content.zh/docs/connectors/table/hive/overview.md
+++ b/docs/content.zh/docs/connectors/table/hive/overview.md
@@ -448,7 +448,7 @@ USE CATALOG myhive;
 
 ## DDL
 
-Support for DDL to create Hive tables, views, partitions, and functions in Flink is coming soon.
+When executing DDL in Flink to work with Hive metadata such as tables, views, partitions, and functions, it is recommended to use the [Hive dialect]({{< ref "docs/connectors/table/hive/hive_dialect" >}})
 
 ## DML
 
diff --git a/docs/content/docs/connectors/table/hive/hive_catalog.md b/docs/content/docs/connectors/table/hive/hive_catalog.md
index f02c4f6..42f6a6f 100644
--- a/docs/content/docs/connectors/table/hive/hive_catalog.md
+++ b/docs/content/docs/connectors/table/hive/hive_catalog.md
@@ -64,12 +64,9 @@ Generic tables, on the other hand, are specific to Flink. When creating generic
 HMS to persist the metadata. While these tables are visible to Hive, it's unlikely Hive is able to understand
 the metadata. And therefore using such tables in Hive leads to undefined behavior.
 
-Flink uses the property '*is_generic*' to tell whether a table is Hive-compatible or generic. When creating a table with
-`HiveCatalog`, it's by default considered generic. If you'd like to create a Hive-compatible table, make sure to set
-`is_generic` to false in your table properties.
-
-As stated above, generic tables shouldn't be used from Hive. In Hive CLI, you can call `DESCRIBE FORMATTED` for a table and
-decide whether it's generic or not by checking the `is_generic` property. Generic tables will have `is_generic=true`.
+It's recommended to switch to [Hive dialect]({{< ref "docs/connectors/table/hive/hive_dialect" >}}) to create Hive-compatible tables.
+If you want to create Hive-compatible tables with default dialect, make sure to set `'connector'='hive'` in your table properties, otherwise
+a table is considered generic by default in `HiveCatalog`. Note that the `connector` property is not required if you use Hive dialect.
 
 ### Example
 
@@ -206,7 +203,7 @@ root
 
 ```
 
-Verify the table is also visible to Hive via Hive Cli, and note that the table has property `is_generic=true`:
+Verify the table is also visible to Hive via Hive Cli:
 
 ```bash
 hive> show tables;
@@ -214,45 +211,6 @@ OK
 mykafka
 Time taken: 0.038 seconds, Fetched: 1 row(s)
 
-hive> describe formatted mykafka;
-OK
-# col_name             data_type               comment
-
-
-# Detailed Table Information
-Database:              default
-Owner:                 null
-CreateTime:            ......
-LastAccessTime:        UNKNOWN
-Retention:             0
-Location:              ......
-Table Type:            MANAGED_TABLE
-Table Parameters:
-       flink.connector.properties.bootstrap.servers    localhost:9092
-       flink.connector.topic   test
-       flink.connector.type    kafka
-       flink.connector.version universal
-       flink.format.type       csv
-       flink.generic.table.schema.0.data-type  VARCHAR(2147483647)
-       flink.generic.table.schema.0.name       name
-       flink.generic.table.schema.1.data-type  INT
-       flink.generic.table.schema.1.name       age
-       flink.update-mode       append
-       is_generic              true
-       transient_lastDdlTime   ......
-
-# Storage Information
-SerDe Library:         org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
-InputFormat:           org.apache.hadoop.mapred.TextInputFormat
-OutputFormat:          org.apache.hadoop.hive.ql.io.IgnoreKeyTextOutputFormat
-Compressed:            No
-Num Buckets:           -1
-Bucket Columns:        []
-Sort Columns:          []
-Storage Desc Params:
-       serialization.format    1
-Time taken: 0.158 seconds, Fetched: 36 row(s)
-
 ```
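
For context, the behavior documented by the patch's new text can be sketched in Flink SQL. This is a hypothetical illustration, not part of the commit; the table names are made up:

```sql
-- Default (Flink) dialect: without 'connector'='hive' in the table
-- properties, HiveCatalog treats the table as generic, so set it
-- explicitly to get a Hive-compatible table.
CREATE TABLE hive_compatible_tbl (
  name STRING,
  age  INT
) WITH (
  'connector' = 'hive'
);

-- Recommended alternative: switch to the Hive dialect, where the
-- 'connector' property is not required.
SET table.sql-dialect = hive;
CREATE TABLE hive_compatible_tbl2 (
  name STRING,
  age  INT
);
```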
 
 
