lirui-apache commented on a change in pull request #12439:
URL: https://github.com/apache/flink/pull/12439#discussion_r435861584



##########
File path: docs/dev/table/hive/hive_dialect.md
##########
@@ -0,0 +1,347 @@
+---
+title: "Hive Dialect"
+nav-parent_id: hive_tableapi
+nav-pos: 1
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+Starting from 1.11.0, Flink allows users to write SQL statements in Hive syntax when the Hive dialect
+is used. By providing compatibility with Hive syntax, we aim to improve the interoperability with
+Hive and reduce the cases in which users need to switch between Flink and Hive in order to execute
+different statements.
+
+* This will be replaced by the TOC
+{:toc}
+
+## Use Hive Dialect
+
+Flink currently supports two SQL dialects: `default` and `hive`. You need to switch to the Hive dialect
+before you can write in Hive syntax. The following describes how to set the dialect with the
+SQL Client and the Table API. Also note that you can dynamically switch the dialect for each
+statement you execute. There's no need to restart a session to use a different dialect.
+
+### SQL Client
+
+The SQL dialect can be specified via the `table.sql-dialect` property. Therefore, you can set the initial dialect to use in
+the `configuration` section of the yaml file for your SQL Client.
+
+{% highlight yaml %}
+
+execution:
+  planner: blink
+  type: batch
+  result-mode: table
+
+configuration:
+  table.sql-dialect: hive
+  
+{% endhighlight %}
+
+You can also set the dialect after the SQL Client has launched.
+
+{% highlight bash %}
+
+Flink SQL> set table.sql-dialect=hive; -- to use hive dialect
+[INFO] Session property has been set.
+
+Flink SQL> set table.sql-dialect=default; -- to use default dialect
+[INFO] Session property has been set.
+
+{% endhighlight %}
+
+### Table API
+
+You can set the dialect for your TableEnvironment with the Table API.
+
+{% highlight java %}
+
+EnvironmentSettings settings = EnvironmentSettings.newInstance().useBlinkPlanner()...build();
+TableEnvironment tableEnv = TableEnvironment.create(settings);
+// to use hive dialect
+tableEnv.getConfig().setSqlDialect(SqlDialect.HIVE);
+// to use default dialect
+tableEnv.getConfig().setSqlDialect(SqlDialect.DEFAULT);
+
+{% endhighlight %}
+
+## DDL
+
+This section lists the DDL statements supported with the Hive dialect. We'll mainly focus on the syntax
+here. You can refer to the [Hive doc](https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DDL)
+for the semantics of each DDL statement.
+
+### DATABASE
+
+#### Show
+
+{% highlight sql %}
+SHOW DATABASES;
+{% endhighlight %}
+
+#### Create
+
+{% highlight sql %}
+CREATE (DATABASE|SCHEMA) [IF NOT EXISTS] database_name
+  [COMMENT database_comment]
+  [LOCATION fs_path]
+  [WITH DBPROPERTIES (property_name=property_value, ...)];
+{% endhighlight %}
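+
+As a quick illustration of the syntax above (the database name, comment, and property below are hypothetical; see the Hive doc linked above for full semantics):
+
+{% highlight sql %}
+CREATE DATABASE IF NOT EXISTS mydb
+  COMMENT 'a database created with the Hive dialect'
+  WITH DBPROPERTIES (owner='flink');
+{% endhighlight %}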
+
+#### Alter
+
+##### Update Properties
+
+{% highlight sql %}
+ALTER (DATABASE|SCHEMA) database_name SET DBPROPERTIES (property_name=property_value, ...);
+{% endhighlight %}
+
+##### Update Owner
+
+{% highlight sql %}
+ALTER (DATABASE|SCHEMA) database_name SET OWNER [USER|ROLE] user_or_role;
+{% endhighlight %}
+
+##### Update Location
+
+{% highlight sql %}
+ALTER (DATABASE|SCHEMA) database_name SET LOCATION fs_path;
+{% endhighlight %}
+
+#### Drop
+
+{% highlight sql %}
+DROP (DATABASE|SCHEMA) [IF EXISTS] database_name [RESTRICT|CASCADE];
+{% endhighlight %}
+
+#### Use
+
+{% highlight sql %}
+USE database_name;
+{% endhighlight %}
+
+### TABLE
+
+#### Show
+
+{% highlight sql %}
+SHOW TABLES;
+{% endhighlight %}
+
+#### Create
+
+{% highlight sql %}
+CREATE [EXTERNAL] TABLE [IF NOT EXISTS] table_name
+  [(col_name data_type [column_constraint] [COMMENT col_comment], ... [table_constraint])]
+  [COMMENT table_comment]
+  [PARTITIONED BY (col_name data_type [COMMENT col_comment], ...)]
+  [
+    [ROW FORMAT row_format]
+    [STORED AS file_format]
+  ]
+  [LOCATION fs_path]
+  [TBLPROPERTIES (property_name=property_value, ...)]
+  
+row_format:
+  : DELIMITED [FIELDS TERMINATED BY char [ESCAPED BY char]] [COLLECTION ITEMS TERMINATED BY char]
+      [MAP KEYS TERMINATED BY char] [LINES TERMINATED BY char]
+      [NULL DEFINED AS char]
+  | SERDE serde_name [WITH SERDEPROPERTIES (property_name=property_value, ...)]
+  
+file_format:
+  : SEQUENCEFILE
+  | TEXTFILE
+  | RCFILE
+  | ORC
+  | PARQUET
+  | AVRO
+  | INPUTFORMAT input_format_classname OUTPUTFORMAT output_format_classname
+  
+column_constraint:
+  : NOT NULL [[ENABLE|DISABLE] [VALIDATE|NOVALIDATE] [RELY|NORELY]]
+  
+table_constraint:
+  : [CONSTRAINT constraint_name] PRIMARY KEY (col_name, ...) [[ENABLE|DISABLE] [VALIDATE|NOVALIDATE] [RELY|NORELY]]
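+
+As a minimal sketch of this grammar (the table and column names below are hypothetical; refer to the Hive doc for full semantics):
+
+{% highlight sql %}
+CREATE TABLE IF NOT EXISTS pageviews (
+  user_id BIGINT,
+  url STRING
+)
+PARTITIONED BY (dt STRING)
+ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
+STORED AS TEXTFILE;
+{% endhighlight %}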

Review comment:
       We'll need a lot of examples to cover all the use cases. Besides, since 
we follow Hive syntax, the examples we add will just be duplicates of what's 
already in Hive's own doc. So I'd prefer to only list the supported syntax 
here. Users can always refer to Hive docs for semantics and examples.



