This is an automated email from the ASF dual-hosted git repository.
martijnvisser pushed a commit to branch release-1.15
in repository https://gitbox.apache.org/repos/asf/flink.git
The following commit(s) were added to refs/heads/release-1.15 by this push:
new 7ce4be09957 [FLINK-24940][docs] Correct usage about how to create Hive catalog via Flink SQL CLI. This closes #17829
7ce4be09957 is described below
commit 7ce4be0995740c3ae4a455ba992c48d10d48bd4c
Author: yuxia Luo <[email protected]>
AuthorDate: Thu Nov 18 20:17:01 2021 +0800
[FLINK-24940][docs] Correct usage about how to create Hive catalog via Flink SQL CLI. This closes #17829
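
In summary, the patch removes the obsolete `sql-cli-defaults.yaml` catalog section and documents the DDL-based approach instead. The corrected statements, as they appear in the updated docs, are:

```sql
-- Create a Hive catalog from the Flink SQL Client; hive-conf-dir must
-- point to a directory containing a valid hive-site.xml (the path below
-- is the example used in the docs).
CREATE CATALOG myhive WITH (
  'type' = 'hive',
  'hive-conf-dir' = '/opt/hive-conf'
);

-- Make it the active catalog for subsequent statements.
USE CATALOG myhive;
```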
---
.../docs/connectors/table/hive/hive_catalog.md | 34 ++++++++++------------
.../docs/connectors/table/hive/hive_catalog.md | 34 ++++++++++------------
2 files changed, 30 insertions(+), 38 deletions(-)
diff --git a/docs/content.zh/docs/connectors/table/hive/hive_catalog.md b/docs/content.zh/docs/connectors/table/hive/hive_catalog.md
index 6cc9e3041b2..dc2e461fd7f 100644
--- a/docs/content.zh/docs/connectors/table/hive/hive_catalog.md
+++ b/docs/content.zh/docs/connectors/table/hive/hive_catalog.md
@@ -137,28 +137,23 @@ Time taken: 0.028 seconds, Fetched: 0 row(s)
```
-#### step 2: configure Flink cluster and SQL CLI
-
-Add all Hive dependencies to `/lib` dir in Flink distribution, and modify SQL CLI's yaml config file `sql-cli-defaults.yaml` as following:
-
-```yaml
-
-execution:
- type: streaming
- ...
- current-catalog: myhive # set the HiveCatalog as the current catalog of the session
- current-database: mydatabase
-
-catalogs:
- - name: myhive
- type: hive
- hive-conf-dir: /opt/hive-conf # contains hive-site.xml
+#### step 2: start SQL Client, and create a Hive catalog with Flink SQL DDL
+
+Add all Hive dependencies to `/lib` dir in Flink distribution, and create a Hive catalog in Flink SQL CLI as following:
+
+```bash
+
+Flink SQL> CREATE CATALOG myhive WITH (
+ 'type' = 'hive',
+ 'hive-conf-dir' = '/opt/hive-conf'
+);
+
```
#### step 3: set up a Kafka cluster
-Bootstrap a local Kafka 2.3.0 cluster with a topic named "test", and produce some simple data to the topic as tuple of name and age.
+Bootstrap a local Kafka cluster with a topic named "test", and produce some simple data to the topic as tuple of name and age.
```bash
@@ -180,11 +175,12 @@ john,21
```
-#### step 4: start SQL Client, and create a Kafka table with Flink SQL DDL
+#### step 4: create a Kafka table with Flink SQL DDL
-Start Flink SQL Client, create a simple Kafka 2.3.0 table via DDL, and verify its schema.
+Create a simple Kafka table with Flink SQL DDL, and verify its schema.
```bash
+Flink SQL> USE CATALOG myhive;
Flink SQL> CREATE TABLE mykafka (name String, age Int) WITH (
'connector.type' = 'kafka',
diff --git a/docs/content/docs/connectors/table/hive/hive_catalog.md b/docs/content/docs/connectors/table/hive/hive_catalog.md
index 90c1aebf057..932e18fcc0d 100644
--- a/docs/content/docs/connectors/table/hive/hive_catalog.md
+++ b/docs/content/docs/connectors/table/hive/hive_catalog.md
@@ -137,28 +137,23 @@ Time taken: 0.028 seconds, Fetched: 0 row(s)
```
-#### step 2: configure Flink cluster and SQL CLI
-
-Add all Hive dependencies to `/lib` dir in Flink distribution, and modify SQL CLI's yaml config file `sql-cli-defaults.yaml` as following:
-
-```yaml
-
-execution:
- type: streaming
- ...
- current-catalog: myhive # set the HiveCatalog as the current catalog of the session
- current-database: mydatabase
-
-catalogs:
- - name: myhive
- type: hive
- hive-conf-dir: /opt/hive-conf # contains hive-site.xml
+#### step 2: start SQL Client, and create a Hive catalog with Flink SQL DDL
+
+Add all Hive dependencies to `/lib` dir in Flink distribution, and create a Hive catalog in Flink SQL CLI as following:
+
+```bash
+
+Flink SQL> CREATE CATALOG myhive WITH (
+ 'type' = 'hive',
+ 'hive-conf-dir' = '/opt/hive-conf'
+);
+
```
#### step 3: set up a Kafka cluster
-Bootstrap a local Kafka 2.3.0 cluster with a topic named "test", and produce some simple data to the topic as tuple of name and age.
+Bootstrap a local Kafka cluster with a topic named "test", and produce some simple data to the topic as tuple of name and age.
```bash
@@ -180,11 +175,12 @@ john,21
```
-#### step 4: start SQL Client, and create a Kafka table with Flink SQL DDL
+#### step 4: create a Kafka table with Flink SQL DDL
-Start Flink SQL Client, create a simple Kafka 2.3.0 table via DDL, and verify its schema.
+Create a simple Kafka table with Flink SQL DDL, and verify its schema.
```bash
+Flink SQL> USE CATALOG myhive;
Flink SQL> CREATE TABLE mykafka (name String, age Int) WITH (
'connector.type' = 'kafka',
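
The diff is truncated after the first connector option. For context, a complete statement in the legacy `connector.*` option style might look like the sketch below; the topic name, bootstrap address, and format are illustrative assumptions, not values taken from the patch:

```sql
-- Hypothetical completion of the truncated DDL; option values below
-- are placeholders, not from the commit.
CREATE TABLE mykafka (name String, age Int) WITH (
  'connector.type' = 'kafka',
  'connector.version' = 'universal',
  'connector.topic' = 'test',
  'connector.properties.bootstrap.servers' = 'localhost:9092',
  'format.type' = 'csv'
);
```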