luoyuxia commented on code in PR #20719:
URL: https://github.com/apache/flink/pull/20719#discussion_r959326048


##########
docs/content/docs/dev/table/sql-gateway/overview.md:
##########
@@ -0,0 +1,225 @@
+---
+title: Overview
+weight: 1
+type: docs
+aliases:
+- /dev/table/sql-gateway
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+Introduction
+----------------
+
+The SQL Gateway is a service that enables multiple remote clients to execute SQL concurrently. It provides an easy way to submit Flink jobs, look up metadata, and analyze data online.
+
+The SQL Gateway is composed of a pluggable endpoint and the SqlGatewayService. The SqlGatewayService is a processor that is reused by the endpoints to handle requests. The endpoint is the entry point that allows users to connect. Depending on the type of endpoint, users can use different tools to connect.
+
+{{< img width="40%" src="/fig/sql-gateway-architecture.png" alt="SQL Gateway Architecture" >}}
+
+Getting Started
+---------------
+
+This section describes how to set up and run your first Flink SQL program from the command line.
+
+The SQL Gateway is bundled in the regular Flink distribution and thus runnable out-of-the-box. It requires only a running Flink cluster where table programs can be executed. For more information about setting up a Flink cluster, see the [Cluster & Deployment]({{< ref "docs/deployment/resource-providers/standalone/overview" >}}) part. If you simply want to try out the SQL Gateway, you can also start a local cluster with one worker using the following command:
+
+```bash
+./bin/start-cluster.sh
+```
+### Starting the SQL Gateway
+
+The SQL Gateway scripts are also located in the binary directory of Flink. Users can start it by calling:
+
+```bash
+./bin/sql-gateway.sh start
+```
+
+By default, the SQL Gateway is bundled with the REST endpoint. You can use the curl command to check whether the REST endpoint is available.
+
+```bash
+curl http://localhost:8083/v1/info
+{"productName":"Apache Flink","version":"1.16-SNAPSHOT"}%
+```
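The same health check can be scripted. Below is a minimal Python sketch, assuming the default REST address shown above; `parse_info` and `check_gateway` are hypothetical helper names, and the sample payload is the response quoted in this document:

```python
import json
from urllib.request import urlopen

GATEWAY = "http://localhost:8083"  # default REST endpoint address used above

def parse_info(raw: str) -> dict:
    """Parse the JSON body returned by GET /v1/info."""
    info = json.loads(raw)
    return {"product": info["productName"], "version": info["version"]}

def check_gateway(base: str = GATEWAY) -> dict:
    """Hit /v1/info on a running gateway; raises URLError if it is not up."""
    with urlopen(f"{base}/v1/info") as resp:
        return parse_info(resp.read().decode())

# Offline demo with the sample response quoted above:
sample = '{"productName":"Apache Flink","version":"1.16-SNAPSHOT"}'
print(parse_info(sample))  # {'product': 'Apache Flink', 'version': '1.16-SNAPSHOT'}
```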
+
+### Running SQL Queries
+
+To validate your setup and cluster connection, you can follow the steps below.
+
+**Step 1: Open a session**
+
+```bash
+curl --request POST http://localhost:8083/v1/sessions
+
+{"sessionHandle":"f120b289-7241-420d-aac7-aff774ea6f0e"}%
+```
+
+**Step 2: Execute a query**
+
+```bash
+curl --request POST http://localhost:8083/v1/sessions/f120b289-7241-420d-aac7-aff774ea6f0e/statements/ \
+--data '{"statement": "SELECT 1"}'
+
+{"operationHandle":"f2780994-2709-4bed-a1e2-5e7f65ae7794"}%
+```
+
+**Step 3: Fetch results**
+
+```bash
+curl --request GET http://localhost:8083/v1/sessions/f120b289-7241-420d-aac7-aff774ea6f0e/operations/f2780994-2709-4bed-a1e2-5e7f65ae7794/result/0
+
+{
+  "results": {
+    "columns": [
+      {
+        "name": "EXPR$0",
+        "logicalType": {
+          "type": "INTEGER",
+          "nullable": false
+        }
+      }
+    ],
+    "data": [
+      {
+        "kind": "INSERT",
+        "fields": [
+          1
+        ]
+      }
+    ]
+  },
+  "resultType": "PAYLOAD",
+  "nextResultUri": "/v1/sessions/f120b289-7241-420d-aac7-aff774ea6f0e/operations/f2780994-2709-4bed-a1e2-5e7f65ae7794/result/1"
+}%
+```
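A client consuming these responses can flatten the payload generically. Here is a sketch in Python assuming nothing beyond the response shape shown in Step 3; `flatten_results` is a hypothetical helper, and the handles in `nextResultUri` are elided:

```python
import json

def flatten_results(payload: dict):
    """Turn a fetch-results payload into (column_names, rows)."""
    results = payload["results"]
    columns = [col["name"] for col in results["columns"]]
    rows = [row["fields"] for row in results["data"]]
    return columns, rows

# The sample payload from Step 3 (handles elided):
sample = json.loads('''
{
  "results": {
    "columns": [{"name": "EXPR$0", "logicalType": {"type": "INTEGER", "nullable": false}}],
    "data": [{"kind": "INSERT", "fields": [1]}]
  },
  "resultType": "PAYLOAD",
  "nextResultUri": "/v1/sessions/.../operations/.../result/1"
}
''')

columns, rows = flatten_results(sample)
print(columns, rows)  # ['EXPR$0'] [[1]]
# A non-null nextResultUri means another GET is needed for the next page of results.
has_more = sample.get("nextResultUri") is not None
```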
+
+Configuration
+----------------
+
+### SQL Gateway startup options
+
+Currently, the SQL Gateway script supports the following commands. They are discussed in detail in the subsequent paragraphs.
+
+```bash
+./bin/sql-gateway.sh --help
+
+Usage: sql-gateway.sh [start|start-foreground|stop|stop-all] [args]
+  commands:
+    start               - Run a SQL Gateway as a daemon
+    start-foreground    - Run a SQL Gateway as a console application
+    stop                - Stop the SQL Gateway daemon
+    stop-all            - Stop all the SQL Gateway daemons
+    -h | --help         - Show this help message
+```
+
+For the "start" or "start-foreground" command, you can configure the SQL Gateway in the CLI.
+
+```bash
+./bin/sql-gateway.sh start --help
+
+Start the Flink SQL Gateway as a daemon to submit Flink SQL.
+
+  Syntax: start [OPTIONS]
+     -D <property=value>   Use value for given property
+     -h,--help             Show the help message with descriptions of all
+                           options.
+```
+
+### SQL Gateway Configuration
+
+You can configure the SQL Gateway when starting it with any valid [Flink configuration]({{< ref "docs/dev/table/config" >}}) entry:
+
+```bash
+./bin/sql-gateway.sh start -Dkey=value
+```
+
+<table class="configuration table table-bordered">
+    <thead>
+        <tr>
+            <th class="text-left" style="width: 20%">Key</th>
+            <th class="text-left" style="width: 15%">Default</th>
+            <th class="text-left" style="width: 10%">Type</th>
+            <th class="text-left" style="width: 55%">Description</th>
+        </tr>
+    </thead>
+    <tbody>
+        <tr>
+            <td><h5>sql-gateway.session.check-interval</h5></td>
+            <td style="word-wrap: break-word;">1 min</td>
+            <td>Duration</td>
+            <td>The check interval for the idle session timeout, which can be disabled by setting it to a zero or negative value.</td>
+        </tr>
+        <tr>
+            <td><h5>sql-gateway.session.idle-timeout</h5></td>
+            <td style="word-wrap: break-word;">10 min</td>
+            <td>Duration</td>
+            <td>Timeout interval for closing the session when the session has not been accessed during the interval. If set to a zero or negative value, the session will not be closed.</td>
+        </tr>
+        <tr>
+            <td><h5>sql-gateway.session.max-num</h5></td>
+            <td style="word-wrap: break-word;">1000000</td>
+            <td>Integer</td>
+            <td>The maximum number of active sessions for the SQL Gateway service.</td>
+        </tr>
+        <tr>
+            <td><h5>sql-gateway.worker.keepalive-time</h5></td>
+            <td style="word-wrap: break-word;">5 min</td>
+            <td>Duration</td>
+            <td>Keepalive time for an idle worker thread. When the number of workers exceeds the minimum, excess threads are killed after this time interval.</td>
+        </tr>
+        <tr>
+            <td><h5>sql-gateway.worker.threads.max</h5></td>
+            <td style="word-wrap: break-word;">500</td>
+            <td>Integer</td>
+            <td>The maximum number of worker threads for the SQL Gateway service.</td>
+        </tr>
+        <tr>
+            <td><h5>sql-gateway.worker.threads.min</h5></td>
+            <td style="word-wrap: break-word;">5</td>
+            <td>Integer</td>
+            <td>The minimum number of worker threads for the SQL Gateway service.</td>
+        </tr>
+    </tbody>
+</table>
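Each option in the table can be overridden at startup via `-Dkey=value`. Below is a small sketch of how a launcher might render such flags; the `gateway_args` helper is hypothetical, while the option keys and defaults are the ones documented above:

```python
def gateway_args(options: dict) -> list:
    """Render SQL Gateway options into the -Dkey=value flags accepted at startup."""
    args = ["./bin/sql-gateway.sh", "start"]
    for key, value in sorted(options.items()):
        args.append(f"-D{key}={value}")
    return args

argv = gateway_args({
    "sql-gateway.session.idle-timeout": "30min",  # longer than the 10 min default
    "sql-gateway.worker.threads.max": 100,        # below the 500 default
})
print(" ".join(argv))
# ./bin/sql-gateway.sh start -Dsql-gateway.session.idle-timeout=30min -Dsql-gateway.worker.threads.max=100
```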
+
+Supported Endpoints
+----------------
+
+Flink natively supports the REST and [HiveServer2]({{< ref "docs/dev/table/sql-gateway/hiveserver2" >}}) endpoints. The SQL Gateway is bundled with the REST endpoint by default.
+With the flexible architecture users are able to start the SQL Gateway with 
the specified endpoints by calling 

Review Comment:
   nit
   ```suggestion
   With the flexible architecture, users are able to start the SQL Gateway with 
the specified endpoints by calling 
   ```



##########
docs/content/docs/dev/table/sql-gateway/overview.md:
##########
@@ -0,0 +1,225 @@
+### Starting the SQL Gateway
+
+The SQL Gateway scripts are also located in the binary directory of Flink. 
Users can start by calling:
+
+```bash
+./bin/sql-gateway.sh

Review Comment:
   It seems calling it directly doesn't work.



##########
docs/content.zh/docs/dev/table/sql-gateway/hiveserver2.md:
##########
@@ -0,0 +1,258 @@
+---
+title: HiveServer2 Endpoint
+weight: 3
+type: docs
+aliases:
+- /dev/table/sql-gateway
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# HiveServer2 Endpoint
+
+HiveServer2 Endpoint is compatible with HiveServer2 API and allows users to 
submit SQL in Hive style.
+
+Requirements
+----------------
+Before starting the SQL Gateway with the HiveServer2 Endpoint, please prepare the required dependencies.
+
+### Dependencies
+
+To integrate with Hive, you need to add some extra dependencies to the /lib/ directory in the Flink distribution to make the dependencies available. For different Hive versions, please copy the corresponding bundled Hive jar into the /lib/ directory.
+
+| Metastore version | Maven dependency                 | SQL Client JAR |
+|:------------------|:---------------------------------|:---------------|
+| 2.3.0 - 2.3.9     | `flink-sql-connector-hive-2.3.9` | {{< stable >}}[Download](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-hive-2.3.9{{< scala_version >}}/{{< version >}}/flink-sql-connector-hive-2.3.9{{< scala_version >}}-{{< version >}}.jar) {{< /stable >}}{{< unstable >}} Only available for stable releases {{< /unstable >}} |
+| 3.0.0 - 3.1.2     | `flink-sql-connector-hive-3.1.2` | {{< stable >}}[Download](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-hive-3.1.2{{< scala_version >}}/{{< version >}}/flink-sql-connector-hive-3.1.2{{< scala_version >}}-{{< version >}}.jar) {{< /stable >}}{{< unstable >}} Only available for stable releases {{< /unstable >}} |
+
+Apache Hive is built on Hadoop, so you also need to provide Hadoop dependencies by setting the HADOOP_CLASSPATH environment variable:
+```bash
+export HADOOP_CLASSPATH=`hadoop classpath`
+```
+
+Currently, the Hive module relies on the planner module explicitly. Therefore, you should swap the flink-table-planner-loader jar located in the /lib/ directory with the flink-table-planner jar located in the /opt/ directory.
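The jar swap described above is mechanical, so it can be scripted. Below is a hedged Python sketch: `swap_planner_jars` is a hypothetical helper, the jar name patterns are assumptions based on typical distribution layouts, and the Flink home path must point at your installation:

```python
import shutil
from pathlib import Path

def swap_planner_jars(flink_home: Path) -> None:
    """Move flink-table-planner from opt/ into lib/, and park flink-table-planner-loader in opt/."""
    lib, opt = flink_home / "lib", flink_home / "opt"
    # Planner into lib/ so the Hive module can see it.
    for jar in opt.glob("flink-table-planner_*.jar"):
        shutil.move(str(jar), str(lib / jar.name))
    # Loader out of lib/; parking it in opt/ is one way to keep it outside lib/.
    for jar in lib.glob("flink-table-planner-loader-*.jar"):
        shutil.move(str(jar), str(opt / jar.name))

# Usage (adjust for your installation):
# swap_planner_jars(Path("/opt/flink"))
```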
+
+### Configure HiveServer2 Endpoint
+
+The HiveServer2 endpoint is not the default endpoint for the SQL Gateway. You can configure the SQL Gateway to use the HiveServer2 endpoint by calling
+```bash
+./bin/sql-gateway.sh start -Dsql-gateway.endpoint.type=hiveserver2

Review Comment:
   I think it'd be better to tell users how to specify `hive-conf`.



##########
docs/content.zh/docs/dev/table/sql-gateway/hiveserver2.md:
##########
@@ -0,0 +1,258 @@
+### Dependencies
+
+To integrate with Hive, you need to add some extra dependencies to the /lib/ directory in the Flink distribution to make the dependencies available. For different Hive versions, please copy the corresponding bundled Hive jar into the /lib/ directory.
+
+| Metastore version | Maven dependency                 | SQL Client JAR |

Review Comment:
   Should we replace this part with a link reference to `docs/connectors/table/hive/overview/#dependencies`?
   The reason is that this part is not as complete as `docs/connectors/table/hive/overview/#dependencies`.
   Although officially we provide `flink-sql-connector-hive-2.3.9` and `flink-sql-connector-hive-3.1.2`, users can still bundle the dependencies to build for their own Hive version, like hive-2.2.0 or hive-1.x.



##########
docs/content.zh/docs/dev/table/sql-gateway/hiveserver2.md:
##########
@@ -0,0 +1,258 @@
+### Configure HiveServer2 Endpoint
+
+The HiveServer2 endpoint is not the default endpoint for the SQL Gateway. You can configure the SQL Gateway to use the HiveServer2 endpoint by calling
+```bash
+./bin/sql-gateway.sh -Dsql-gateway.endpoint.type=hiveserver2

Review Comment:
   ```suggestion
   ./bin/sql-gateway.sh start -Dsql-gateway.endpoint.type=hiveserver2
   ```
   ?



##########
docs/content/docs/dev/table/sql-gateway/overview.md:
##########
@@ -0,0 +1,225 @@
+Supported Endpoints
+----------------
+
+Flink natively supports the REST and [HiveServer2]({{< ref "docs/dev/table/sql-gateway/hiveserver2" >}}) endpoints. The SQL Gateway is bundled with the REST endpoint by default.
+With the flexible architecture users are able to start the SQL Gateway with 
the specified endpoints by calling 
+
+```bash
+./bin/sql-gateawy.sh start -Dsql-gateway.endpoint.type=hiveserver2

Review Comment:
   ```suggestion
   ./bin/sql-gateway.sh start -Dsql-gateway.endpoint.type=hiveserver2
   ```



##########
docs/content.zh/docs/dev/table/sql-gateway/hiveserver2.md:
##########
@@ -0,0 +1,258 @@
+Currently, Hive module relies on the planner module explicitly. Therefore, you 
should copy the flink-table-planner jar that locates 

Review Comment:
   How about using
   `swap flink-table-planner-loader located in /lib with flink-table-planner_2.12 located in /opt`
   as said in `docs/connectors/table/hive/hive_dialect/`?
   I think it's more concise.
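For readers following along, the swap the comment describes boils down to two `mv` commands. The sketch below is illustrative only: a throwaway directory stands in for a real Flink distribution, and the jar version numbers are placeholders, not those of any particular release.

```shell
# Hypothetical layout: a scratch directory stands in for FLINK_HOME.
FLINK_HOME=$(mktemp -d)
mkdir -p "$FLINK_HOME/lib" "$FLINK_HOME/opt"

# In a real distribution these jars already ship with Flink; here we simulate them.
touch "$FLINK_HOME/opt/flink-table-planner_2.12-1.16.0.jar"
touch "$FLINK_HOME/lib/flink-table-planner-loader-1.16.0.jar"

# The swap: move the planner from /opt into /lib,
# and the planner loader out of /lib into /opt.
mv "$FLINK_HOME"/opt/flink-table-planner_2.12-*.jar "$FLINK_HOME/lib/"
mv "$FLINK_HOME"/lib/flink-table-planner-loader-*.jar "$FLINK_HOME/opt/"
```

After the swap, /lib contains the full planner jar and /opt holds the loader, which is the layout the Hive dialect needs.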
   



##########
docs/content.zh/docs/dev/table/sql-gateway/hiveserver2.md:
##########
@@ -0,0 +1,258 @@
+---
+title: HiveServer2 Endpoint
+weight: 3
+type: docs
+aliases:
+- /dev/table/sql-gateway
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# HiveServer2 Endpoint
+
+The HiveServer2 Endpoint is compatible with the HiveServer2 API and allows users to submit SQL in Hive style.
+
+Requirements
+----------------
+Before starting the SQL Gateway with the HiveServer2 Endpoint, please prepare the required dependencies.
+
+### Dependencies
+
+To integrate with Hive, you need to add some extra dependencies to the /lib/ directory of the Flink distribution to make the dependencies available. For each Hive version, please copy the corresponding Hive bundled jar 
+into the /lib/ directory.
+
+| Metastore version | Maven dependency                 | SQL Client JAR |
+|:------------------|:---------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
+| 2.3.0 - 2.3.9     | `flink-sql-connector-hive-2.3.9` | {{< stable >}}[Download](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-hive-2.3.9{{< scala_version >}}/{{< version >}}/flink-sql-connector-hive-2.3.9{{< scala_version >}}-{{< version >}}.jar) {{< /stable >}}{{< unstable >}} Only available for stable releases {{< /unstable >}} |
+| 3.0.0 - 3.1.2     | `flink-sql-connector-hive-3.1.2` | {{< stable >}}[Download](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-hive-3.1.2{{< scala_version >}}/{{< version >}}/flink-sql-connector-hive-3.1.2{{< scala_version >}}-{{< version >}}.jar) {{< /stable >}}{{< unstable >}} Only available for stable releases {{< /unstable >}} |
+
+Apache Hive is built on Hadoop, so you also need to provide Hadoop dependencies by setting the HADOOP_CLASSPATH environment variable:
+```
+export HADOOP_CLASSPATH=`hadoop classpath`
+```
+
+Currently, the Hive module relies on the planner module explicitly. Therefore, you should copy the flink-table-planner jar located 

Review Comment:
   Exactly, it's the Hive dialect that relies on the planner module. If users don't use the Hive dialect, they don't need to move the `flink-table-planner` jar.



##########
docs/content.zh/docs/dev/table/sql-gateway/hiveserver2.md:
##########
@@ -0,0 +1,258 @@
+---
+title: HiveServer2 Endpoint
+weight: 3
+type: docs
+aliases:
+- /dev/table/sql-gateway
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# HiveServer2 Endpoint
+
+The HiveServer2 Endpoint is compatible with the HiveServer2 API and allows users to submit SQL in Hive style.
+
+Requirements
+----------------
+Before starting the SQL Gateway with the HiveServer2 Endpoint, please prepare the required dependencies.
+
+### Dependencies
+
+To integrate with Hive, you need to add some extra dependencies to the /lib/ directory of the Flink distribution to make the dependencies available. For each Hive version, please copy the corresponding Hive bundled jar 
+into the /lib/ directory.
+
+| Metastore version | Maven dependency                 | SQL Client JAR |

Review Comment:
   Should we replace this part with a link reference to `docs/connectors/table/hive/overview/#dependencies`?
   The reason is this part is not as complete as `docs/connectors/table/hive/overview/#dependencies`.
   Although officially we provide `flink-sql-connector-hive-2.3.9` and `flink-sql-connector-hive-3.1.2`, users can still bundle the dependency to build for their own Hive version, like hive-2.2.0 or hive 1.x.
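As a concrete illustration of the copy step the quoted docs describe, the sketch below simulates dropping a Hive bundled jar into Flink's /lib/ directory. All paths and jar names are placeholders (scratch directories stand in for a real Flink distribution and a download location); substitute the jar matching your Hive version and Flink release.

```shell
# Hypothetical setup: simulate a Flink distribution and a downloaded connector jar.
FLINK_HOME=$(mktemp -d)
mkdir -p "$FLINK_HOME/lib"
DOWNLOADS=$(mktemp -d)
touch "$DOWNLOADS/flink-sql-connector-hive-2.3.9_2.12-1.16.0.jar"

# Copy the Hive bundled jar into Flink's /lib directory.
cp "$DOWNLOADS"/flink-sql-connector-hive-*.jar "$FLINK_HOME/lib/"

# Hadoop dependencies come from HADOOP_CLASSPATH rather than /lib:
# export HADOOP_CLASSPATH=$(hadoop classpath)
```

With the jar in /lib and HADOOP_CLASSPATH set, the SQL Gateway picks up the Hive dependencies on startup.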



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
