[GitHub] [flink] danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add documentation for DDL introduction

2019-08-16 Thread GitBox
danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add 
documentation for DDL introduction
URL: https://github.com/apache/flink/pull/9366#discussion_r314722894
 
 

 ##
 File path: docs/dev/table/ddl.md
 ##
 @@ -0,0 +1,119 @@
+---
+title: "DDL"
+nav-parent_id: tableapi
+nav-pos: 0
+---
+
+
+The Table API and SQL are integrated in a joint API. The central concept of
this API is a `Table`, which serves as input and output of queries. This
document describes all the DDL grammar that Flink supports: how to register a
`Table` (or view) through DDL and how to drop a `Table` (or view) through DDL.
+
+* This will be replaced by the TOC
+{:toc}
+
+Create Table
+---
+{% highlight sql %}
+CREATE [OR REPLACE] TABLE [catalog_name.][db_name.]table_name
+  [(col_name1 col_type1 [COMMENT col_comment1], ...)]
+  [COMMENT table_comment]
+  [PARTITIONED BY (col_name1, col_name2, ...)]
+  [WITH (key1=val1, key2=val2, ...)]
+{% endhighlight %}
+
+Create a table with the given table properties. If a table with the same name
already exists in the database, an exception is thrown unless *IF NOT EXISTS*
is declared.
 
 Review comment:
   Removed


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [flink] danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add documentation for DDL introduction

2019-08-16 Thread GitBox
danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add 
documentation for DDL introduction
URL: https://github.com/apache/flink/pull/9366#discussion_r314722591
 
 

 ##
 File path: docs/dev/table/ddl.md
 ##
 @@ -0,0 +1,76 @@
+---
+title: "DDL"
+nav-parent_id: tableapi
+nav-pos: 0
+---
+
+
+The Table API and SQL are integrated in a joint API. The central concept of
this API is a `Table`, which serves as input and output of queries. This
document describes all the DDL grammar that Flink supports: how to register a
`Table` (or view) through DDL and how to drop a `Table` (or view) through DDL.
+
+* This will be replaced by the TOC
+{:toc}
+
+Create Table
+---
+{% highlight sql %}
+CREATE [OR REPLACE] TABLE [catalog_name.][db_name.]table_name
+  [(col_name1 col_type1 [COMMENT col_comment1], ...)]
+  [COMMENT table_comment]
+  [PARTITIONED BY (col_name1, col_name2, ...)]
+  [WITH (key1=val1, key2=val2, ...)]
+{% endhighlight %}
+
+Create a table with the given table properties. If a table with the same name 
already exists in the database, an exception is thrown.
+
+**PARTITIONED BY**
+
+Partition the created table by the specified columns. A directory is created 
for each partition if this table is used as a filesystem sink.
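+
+For illustration, a minimal sketch of a partitioned table (the connector
properties and paths are illustrative; the exact directory layout depends on
the connector):
+{% highlight sql %}
+CREATE TABLE Logs (
+  message STRING,
+  dt STRING
+)
+PARTITIONED BY (dt)
+WITH (
+  'connector.type' = 'filesystem',
+  'connector.path' = 'file:///tmp/logs',
+  'format.type' = 'csv'
+);
+-- as a filesystem sink, a directory is created for each value of dt
+{% endhighlight %}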
+
+**WITH OPTIONS**
+
+Table properties used to create a table source/sink. The properties are
usually used to find and create the underlying connector. **Notes:** the key
and value of the expression `key1=val1` should both be string literals.
+
+See [Connect to External Systems](connect.html) for details on all the
supported table properties of the different connectors.
+
+**Notes:** The table name can take two formats: 1.
`catalog_name.db_name.table_name` 2. `table_name`. For
`catalog_name.db_name.table_name`, the table is registered into the metastore
with the catalog named "catalog_name" and the database named "db_name"; for
`table_name`, the table is registered into the current catalog and database of
the execution table environment.
+
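+A minimal sketch of both formats (the catalog, database, and connector
properties are illustrative):
+{% highlight sql %}
+-- registered into the catalog "hive_catalog" and the database "db1"
+CREATE TABLE hive_catalog.db1.Orders (
+  order_id BIGINT,
+  price DECIMAL(10, 2)
+) WITH (
+  'connector.type' = 'filesystem',
+  'connector.path' = 'file:///tmp/orders',
+  'format.type' = 'csv'
+);
+
+-- registered into the current catalog and database
+CREATE TABLE Orders (
+  order_id BIGINT,
+  price DECIMAL(10, 2)
+) WITH (
+  'connector.type' = 'filesystem',
+  'connector.path' = 'file:///tmp/orders',
+  'format.type' = 'csv'
+);
+{% endhighlight %}
+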
+{% top %}
+
+Drop Table
+---
+{% highlight sql %}
+DROP TABLE [IF EXISTS] [catalog_name.][db_name.]table_name
+{% endhighlight %}
+
+Drop a table with the given table name. If the table to drop does not exist, 
an exception is thrown.
+
+**IF EXISTS**
+
+If the table does not exist, nothing happens.
+
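+For example (the table name is illustrative):
+{% highlight sql %}
+-- throws an exception if Orders does not exist
+DROP TABLE Orders;
+
+-- does nothing if Orders does not exist
+DROP TABLE IF EXISTS Orders;
+{% endhighlight %}
+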
+{% top %}
+
+DDL Data Types
+---
+For DDLs, we support the full set of data types defined in the
[Data Types]({{ site.baseurl }}/dev/table/types.html) page.
+
+**Notes:** Some of the data types are not supported in SQL queries yet (i.e.
in cast expressions or literals), e.g. `STRING`, `BYTES`, `TIME(p) WITHOUT
TIME ZONE`, `TIME(p) WITH LOCAL TIME ZONE`, `TIMESTAMP(p) WITHOUT TIME ZONE`,
`TIMESTAMP(p) WITH LOCAL TIME ZONE`, `ARRAY`, `MULTISET`, `ROW`.
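+
+A sketch of the implication (connector properties are illustrative): such a
type can still be declared in a DDL, while queries fall back to a supported
type such as `VARCHAR` for casts and literals:
+{% highlight sql %}
+CREATE TABLE Words (
+  word STRING
+) WITH (
+  'connector.type' = 'filesystem',
+  'connector.path' = 'file:///tmp/words',
+  'format.type' = 'csv'
+);
+
+-- CAST(... AS STRING) is not supported in queries yet; use VARCHAR instead
+SELECT CAST(word AS VARCHAR) FROM Words;
+{% endhighlight %}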
 
 Review comment:
  Okay, I would move it there.




[GitHub] [flink] danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add documentation for DDL introduction

2019-08-16 Thread GitBox
danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add 
documentation for DDL introduction
URL: https://github.com/apache/flink/pull/9366#discussion_r314722168
 
 

 ##
 File path: docs/dev/table/connect.md
 ##
 @@ -276,6 +276,40 @@ tables:
 type: VARCHAR
 {% endhighlight %}
 
+
+
+{% highlight sql %}
+-- CREATE a Kafka 0.10 table that reads from the earliest offset (as a table source)
+-- and uses append mode (as a table sink).
 
 Review comment:
   I'm ok to remove it.




[GitHub] [flink] danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add documentation for DDL introduction

2019-08-16 Thread GitBox
danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add 
documentation for DDL introduction
URL: https://github.com/apache/flink/pull/9366#discussion_r314721859
 
 

 ##
 File path: docs/dev/table/connect.md
 ##
 @@ -276,6 +276,40 @@ tables:
 type: VARCHAR
 {% endhighlight %}
 
+
+
+{% highlight sql %}
+-- CREATE a Kafka 0.10 table that reads from the earliest offset (as a table source)
+-- and uses append mode (as a table sink).
+CREATE TABLE MyUserTable (
+  `user` bigint,
 
 Review comment:
  Already uses capital letters?




[GitHub] [flink] danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add documentation for DDL introduction

2019-08-16 Thread GitBox
danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add 
documentation for DDL introduction
URL: https://github.com/apache/flink/pull/9366#discussion_r314718253
 
 

 ##
 File path: docs/dev/table/connect.md
 ##
 @@ -276,6 +276,40 @@ tables:
 type: VARCHAR
 {% endhighlight %}
 
+
+
+{% highlight sql %}
+-- CREATE a Kafka 0.10 table that reads from the earliest offset (as a table source)
+-- and uses append mode (as a table sink).
+CREATE TABLE MyUserTable (
+  `user` bigint,
+  message varchar,
+  ts varchar
 
 Review comment:
  Initially I used string, but Jark said that `STRING` is not usable now, and I agree.




[GitHub] [flink] danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add documentation for DDL introduction

2019-08-16 Thread GitBox
danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add 
documentation for DDL introduction
URL: https://github.com/apache/flink/pull/9366#discussion_r314717941
 
 

 ##
 File path: docs/dev/table/connect.md
 ##
 @@ -276,6 +276,40 @@ tables:
 type: VARCHAR
 {% endhighlight %}
 
+
+
+{% highlight sql %}
+-- CREATE a Kafka 0.10 table that reads from the earliest offset (as a table source)
+-- and uses append mode (as a table sink).
+CREATE TABLE MyUserTable (
+  `user` bigint,
+  message varchar,
+  ts varchar
+) WITH (
+  -- declare the external system to connect to
+  'connector.type' = 'kafka',
+  'connector.version' = '0.10',
+  'connector.topic' = 'topic_name',
+  'connector.startup-mode' = 'earliest-offset',
+  'connector.properties.0.key' = 'zookeeper.connect',
+  'connector.properties.0.value' = 'localhost:2181',
+  'connector.properties.1.key' = 'bootstrap.servers',
+  'connector.properties.1.value' = 'localhost:9092',
+  'update-mode' = 'append',
+  -- declare a format for this system
+  'format.type' = 'avro',
+  'format.avro-schema' = '{
+\"namespace\": \"org.myorganization\",
 
 Review comment:
  This is a quote for a JSON string, not a SQL identifier quote.
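
For context, a minimal sketch of the escaping in question (the schema content is illustrative): the double quotes belong to the JSON document embedded in the SQL string literal, so they are escaped with backslashes rather than with SQL identifier quoting:

{% highlight sql %}
  'format.avro-schema' = '{
    \"namespace\": \"org.myorganization\",
    \"type\": \"record\",
    \"name\": \"UserMessage\",
    \"fields\": [
      {\"name\": \"user\", \"type\": \"long\"}
    ]
  }'
{% endhighlight %}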




[GitHub] [flink] danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add documentation for DDL introduction

2019-08-16 Thread GitBox
danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add 
documentation for DDL introduction
URL: https://github.com/apache/flink/pull/9366#discussion_r314663434
 
 

 ##
 File path: docs/dev/table/connect.md
 ##
 @@ -652,6 +686,34 @@ connector:
   path: "file:///path/to/whatever"# required: path to a file or directory
 {% endhighlight %}
 
+
+
+{% highlight sql %}
+-- CREATE a partitioned CSV table using the CREATE TABLE syntax.
+create table csv_table (
+  user bigint,
+  message string,
+  ts string
 
 Review comment:
  I don't think there is anything confusing; I put one in to let users know how to escape the double quotes.




[GitHub] [flink] danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add documentation for DDL introduction

2019-08-16 Thread GitBox
danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add 
documentation for DDL introduction
URL: https://github.com/apache/flink/pull/9366#discussion_r314664862
 
 

 ##
 File path: docs/dev/table/connect.md
 ##
 @@ -900,6 +1026,78 @@ connector:
connection-path-prefix: "/v1" # optional: prefix string to be added to every REST communication
 {% endhighlight %}
 
+
+
+{% highlight sql %}
+-- CREATE a version 6 Elasticsearch table.
+create table MyUserTable (
+  user bigint,
+  message string,
+  ts string
+) with (
+  'connector.type' = 'elasticsearch', -- required: specify this table type is elasticsearch
+
+  'connector.version' = '6',          -- required: valid connector versions are "6"
+
+  'format.type' = 'json',             -- required: specify which format to deserialize (as table source)
+
+  'connector.hosts.0.hostname' = 'host_name',  -- required: one or more Elasticsearch hosts to connect to
+  'connector.hosts.0.port' = '9092',
+  'connector.hosts.0.protocol' = 'http',
+
+  'connector.index' = 'MyUsers',       -- required: Elasticsearch index
+
+  'connector.document-type' = 'user',  -- required: Elasticsearch document type
+
+  'update-mode' = 'append',            -- optional: update mode when used as table sink,
+                                       -- only append mode is supported now.
+
+  'format.derive-schema' = 'true',     -- optional: derive the serialize/deserialize format
+                                       -- schema from the table schema.
+
+  'format.json-schema' = '...',        -- optional: specify the serialize/deserialize format schema,
 
 Review comment:
  Why?




[GitHub] [flink] danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add documentation for DDL introduction

2019-08-16 Thread GitBox
danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add 
documentation for DDL introduction
URL: https://github.com/apache/flink/pull/9366#discussion_r314663166
 
 

 ##
 File path: docs/dev/table/connect.md
 ##
 @@ -276,6 +276,40 @@ tables:
 type: VARCHAR
 {% endhighlight %}
 
+
+
+{% highlight sql %}
+-- CREATE a Kafka 0.10 table that reads from the earliest offset (as a table source)
+-- and uses append mode (as a table sink).
+create table MyUserTable (
+  user bigint,
+  message string,
+  ts string
+) with (
+  -- declare the external system to connect to
+  'connector.type' = 'kafka',
+  'connector.version' = '0.10',
+  'update-mode' = 'append',
 
 Review comment:
   Moved




[GitHub] [flink] danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add documentation for DDL introduction

2019-08-16 Thread GitBox
danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add 
documentation for DDL introduction
URL: https://github.com/apache/flink/pull/9366#discussion_r314664613
 
 

 ##
 File path: docs/dev/table/connect.md
 ##
 @@ -753,6 +815,70 @@ connector:
  sink-partitioner-class: org.mycompany.MyPartitioner  # optional: used in case of sink partitioner custom
 {% endhighlight %}
 
+
+
+{% highlight sql %}
+-- CREATE a Kafka 0.11 table that reads from the earliest offset (as a table source)
+-- and uses append mode (as a table sink).
+create table MyUserTable (
+  user bigint,
+  message string,
+  ts string
+) with (
+  'connector.type' = 'kafka',
+
+  'connector.version' = '0.11',     -- required: valid connector versions are
+                                    -- "0.8", "0.9", "0.10", "0.11", and "universal"
+
+  'connector.topic' = 'topic_name', -- required: topic name from which the table is read
+
+  'update-mode' = 'append',         -- required: update mode when used as table sink,
+                                    -- only append mode is supported now.
+
+  'format.type' = 'avro',           -- required: specify which format to deserialize (as table source)
+                                    -- and serialize (as table sink).
+                                    -- Valid format types are: "csv", "json", "avro".
+
+  'connector.properties.0.key' = 'zookeeper.connect',  -- optional: connector specific properties
+  'connector.properties.0.value' = 'localhost:2181',
+  'connector.properties.1.key' = 'bootstrap.servers',
+  'connector.properties.1.value' = 'localhost:9092',
+  'connector.properties.2.key' = 'group.id',
+  'connector.properties.2.value' = 'testGroup',
+
+  'connector.startup-mode' = 'earliest-offset',  -- optional: valid modes are "earliest-offset",
+                                                 -- "latest-offset", "group-offsets",
+                                                 -- or "specific-offsets"
+
+  'connector.specific-offsets.0.partition' = '0',  -- optional: used in case of startup mode with specific offsets
+  'connector.specific-offsets.0.offset' = '42',
+  'connector.specific-offsets.1.partition' = '1',
+  'connector.specific-offsets.1.offset' = '300',
+
+  'connector.sink-partitioner' = '...',  -- optional: output partitioning from Flink's partitions
+                                         -- into Kafka's partitions. Valid values are "fixed"
+                                         -- (each Flink partition ends up in at most one Kafka partition),
+                                         -- "round-robin" (a Flink partition is distributed to
+                                         -- Kafka partitions round-robin), and
+                                         -- "custom" (use a custom FlinkKafkaPartitioner subclass)
+
+  'format.derive-schema' = 'true',   -- optional: derive the serialize/deserialize format
 
 Review comment:
  Why???




[GitHub] [flink] danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add documentation for DDL introduction

2019-08-16 Thread GitBox
danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add 
documentation for DDL introduction
URL: https://github.com/apache/flink/pull/9366#discussion_r314664413
 
 

 ##
 File path: docs/dev/table/connect.md
 ##
 @@ -652,6 +686,34 @@ connector:
   path: "file:///path/to/whatever"# required: path to a file or directory
 {% endhighlight %}
 
+
+
+{% highlight sql %}
+-- CREATE a partitioned CSV table using the CREATE TABLE syntax.
+create table csv_table (
+  user bigint,
+  message string,
+  ts string
+) 
+COMMENT 'This is a csv table.' 
+PARTITIONED BY(user)
+WITH (
+  'connector.type' = 'filesystem',  -- required: specify the connector type
 
 Review comment:
  I don't think so; at least we should give the required properties like `format.type`.
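
For reference, a minimal sketch that spells out only the required properties (the path is illustrative):

{% highlight sql %}
create table csv_table (
  message varchar
) with (
  'connector.type' = 'filesystem',            -- required: the connector type
  'connector.path' = 'file:///tmp/data.csv',  -- required: path to a file or directory
  'format.type' = 'csv'                       -- required: the format type
);
{% endhighlight %}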




[GitHub] [flink] danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add documentation for DDL introduction

2019-08-16 Thread GitBox
danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add 
documentation for DDL introduction
URL: https://github.com/apache/flink/pull/9366#discussion_r314664520
 
 

 ##
 File path: docs/dev/table/connect.md
 ##
 @@ -753,6 +815,70 @@ connector:
  sink-partitioner-class: org.mycompany.MyPartitioner  # optional: used in case of sink partitioner custom
 {% endhighlight %}
 
+
+
+{% highlight sql %}
+-- CREATE a Kafka 0.11 table that reads from the earliest offset (as a table source)
+-- and uses append mode (as a table sink).
+create table MyUserTable (
+  user bigint,
+  message string,
+  ts string
+) with (
+  'connector.type' = 'kafka',
+
+  'connector.version' = '0.11',     -- required: valid connector versions are
+                                    -- "0.8", "0.9", "0.10", "0.11", and "universal"
+
+  'connector.topic' = 'topic_name', -- required: topic name from which the table is read
+
+  'update-mode' = 'append',         -- required: update mode when used as table sink,
+                                    -- only append mode is supported now.
+
+  'format.type' = 'avro',           -- required: specify which format to deserialize (as table source)
 
 Review comment:
  Why? It is a required option.




[GitHub] [flink] danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add documentation for DDL introduction

2019-08-16 Thread GitBox
danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add 
documentation for DDL introduction
URL: https://github.com/apache/flink/pull/9366#discussion_r314666452
 
 

 ##
 File path: docs/dev/table/ddl.md
 ##
 @@ -0,0 +1,119 @@
+---
+title: "DDL"
+nav-parent_id: tableapi
+nav-pos: 0
+---
+
+
+The Table API and SQL are integrated in a joint API. The central concept of
this API is a `Table`, which serves as input and output of queries. This
document describes all the DDL grammar that Flink supports: how to register a
`Table` (or view) through DDL and how to drop a `Table` (or view) through DDL.
+
+* This will be replaced by the TOC
+{:toc}
+
+Create Table
+---
+{% highlight sql %}
+CREATE [OR REPLACE] TABLE [catalog_name.][db_name.]table_name
+  [(col_name1 col_type1 [COMMENT col_comment1], ...)]
+  [COMMENT table_comment]
+  [PARTITIONED BY (col_name1, col_name2, ...)]
+  [WITH (key1=val1, key2=val2, ...)]
 
 Review comment:
  I kind of like the grammar form on the Calcite reference page; we don't need any quotes for the literal itself. I have added notes on the format.




[GitHub] [flink] danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add documentation for DDL introduction

2019-08-16 Thread GitBox
danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add 
documentation for DDL introduction
URL: https://github.com/apache/flink/pull/9366#discussion_r314664985
 
 

 ##
 File path: docs/dev/table/ddl.md
 ##
 @@ -0,0 +1,119 @@
+---
+title: "DDL"
+nav-parent_id: tableapi
+nav-pos: 0
+---
+
+
+The Table API and SQL are integrated in a joint API. The central concept of
this API is a `Table`, which serves as input and output of queries. This
document describes all the DDL grammar that Flink supports: how to register a
`Table` (or view) through DDL and how to drop a `Table` (or view) through DDL.
+
+* This will be replaced by the TOC
+{:toc}
+
+Create Table
+---
+{% highlight sql %}
+CREATE [OR REPLACE] TABLE [catalog_name.][db_name.]table_name
 
 Review comment:
   Removed




[GitHub] [flink] danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add documentation for DDL introduction

2019-08-16 Thread GitBox
danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add 
documentation for DDL introduction
URL: https://github.com/apache/flink/pull/9366#discussion_r314662314
 
 

 ##
 File path: docs/dev/table/connect.md
 ##
 @@ -276,6 +276,40 @@ tables:
 type: VARCHAR
 {% endhighlight %}
 
+
+
+{% highlight sql %}
+-- CREATE a Kafka 0.10 table that reads from the earliest offset (as a table source)
+-- and uses append mode (as a table sink).
+create table MyUserTable (
+  user bigint,
+  message string,
 
 Review comment:
   Changed to varchar




[GitHub] [flink] danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add documentation for DDL introduction

2019-08-16 Thread GitBox
danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add 
documentation for DDL introduction
URL: https://github.com/apache/flink/pull/9366#discussion_r314661527
 
 

 ##
 File path: docs/dev/table/connect.md
 ##
 @@ -276,6 +276,40 @@ tables:
 type: VARCHAR
 {% endhighlight %}
 
+
+
 
 Review comment:
   Tried




[GitHub] [flink] danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add documentation for DDL introduction

2019-08-16 Thread GitBox
danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add 
documentation for DDL introduction
URL: https://github.com/apache/flink/pull/9366#discussion_r314664643
 
 

 ##
 File path: docs/dev/table/connect.md
 ##
 @@ -753,6 +815,70 @@ connector:
  sink-partitioner-class: org.mycompany.MyPartitioner  # optional: used in case of sink partitioner custom
 {% endhighlight %}
 
+
+
+{% highlight sql %}
+-- CREATE a Kafka 0.11 table that reads from the earliest offset (as a table source)
+-- and uses append mode (as a table sink).
+create table MyUserTable (
+  user bigint,
+  message string,
+  ts string
+) with (
+  'connector.type' = 'kafka',
+
+  'connector.version' = '0.11',     -- required: valid connector versions are
+                                    -- "0.8", "0.9", "0.10", "0.11", and "universal"
+
+  'connector.topic' = 'topic_name', -- required: topic name from which the table is read
+
+  'update-mode' = 'append',         -- required: update mode when used as table sink,
+                                    -- only append mode is supported now.
+
+  'format.type' = 'avro',           -- required: specify which format to deserialize (as table source)
+                                    -- and serialize (as table sink).
+                                    -- Valid format types are: "csv", "json", "avro".
+
+  'connector.properties.0.key' = 'zookeeper.connect',  -- optional: connector specific properties
+  'connector.properties.0.value' = 'localhost:2181',
+  'connector.properties.1.key' = 'bootstrap.servers',
+  'connector.properties.1.value' = 'localhost:9092',
+  'connector.properties.2.key' = 'group.id',
+  'connector.properties.2.value' = 'testGroup',
+
+  'connector.startup-mode' = 'earliest-offset',  -- optional: valid modes are "earliest-offset",
+                                                 -- "latest-offset", "group-offsets",
+                                                 -- or "specific-offsets"
+
+  'connector.specific-offsets.0.partition' = '0',  -- optional: used in case of startup mode with specific offsets
+  'connector.specific-offsets.0.offset' = '42',
+  'connector.specific-offsets.1.partition' = '1',
+  'connector.specific-offsets.1.offset' = '300',
+
+  'connector.sink-partitioner' = '...',  -- optional: output partitioning from Flink's partitions
+                                         -- into Kafka's partitions. Valid values are "fixed"
+                                         -- (each Flink partition ends up in at most one Kafka partition),
+                                         -- "round-robin" (a Flink partition is distributed to
+                                         -- Kafka partitions round-robin), and
+                                         -- "custom" (use a custom FlinkKafkaPartitioner subclass)
+
+  'format.derive-schema' = 'true',   -- optional: derive the serialize/deserialize format
+                                     -- schema from the table schema.
+
+  'format.avro-schema' =             -- optional: specify the serialize/deserialize format schema,
 
 Review comment:
  Why???




[GitHub] [flink] danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add documentation for DDL introduction

2019-08-16 Thread GitBox
danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add 
documentation for DDL introduction
URL: https://github.com/apache/flink/pull/9366#discussion_r314664808
 
 

 ##
 File path: docs/dev/table/connect.md
 ##
 @@ -900,6 +1026,78 @@ connector:
connection-path-prefix: "/v1" # optional: prefix string to be added to every REST communication
 {% endhighlight %}
 
+
+
+{% highlight sql %}
+-- CREATE a version 6 Elasticsearch table.
+create table MyUserTable (
+  user bigint,
+  message string,
+  ts string
+) with (
+  'connector.type' = 'elasticsearch', -- required: specify this table type is elasticsearch
+
+  'connector.version' = '6',          -- required: valid connector versions are "6"
+
+  'format.type' = 'json',             -- required: specify which format to deserialize (as table source)
 
 Review comment:
  Why?




[GitHub] [flink] danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add documentation for DDL introduction

2019-08-16 Thread GitBox
danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add 
documentation for DDL introduction
URL: https://github.com/apache/flink/pull/9366#discussion_r314662058
 
 

 ##
 File path: docs/dev/table/connect.md
 ##
 @@ -276,6 +276,40 @@ tables:
 type: VARCHAR
 {% endhighlight %}
 
+
+
+{% highlight sql %}
+-- CREATE a Kafka 0.10 table that reads from the earliest offset (as a table source)
+-- and uses append mode (as a table sink).
+create table MyUserTable (
+  user bigint,
 
 Review comment:
   Yep, thanks.




[GitHub] [flink] danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add documentation for DDL introduction

2019-08-16 Thread GitBox
danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add 
documentation for DDL introduction
URL: https://github.com/apache/flink/pull/9366#discussion_r314664840
 
 

 ##
 File path: docs/dev/table/connect.md
 ##
 @@ -900,6 +1026,78 @@ connector:
connection-path-prefix: "/v1" # optional: prefix string to be added to every REST communication
 {% endhighlight %}
 
+
+
+{% highlight sql %}
+-- CREATE a version 6 Elasticsearch table.
+create table MyUserTable (
+  user bigint,
+  message string,
+  ts string
+) with (
+  'connector.type' = 'elasticsearch', -- required: specify this table type is elasticsearch
+
+  'connector.version' = '6',          -- required: valid connector versions are "6"
+
+  'format.type' = 'json',             -- required: specify which format to deserialize (as table source)
+
+  'connector.hosts.0.hostname' = 'host_name',  -- required: one or more Elasticsearch hosts to connect to
+  'connector.hosts.0.port' = '9092',
+  'connector.hosts.0.protocol' = 'http',
+
+  'connector.index' = 'MyUsers',       -- required: Elasticsearch index
+
+  'connector.document-type' = 'user',  -- required: Elasticsearch document type
+
+  'update-mode' = 'append',            -- optional: update mode when used as table sink,
+                                       -- only append mode is supported now.
+
+  'format.derive-schema' = 'true',     -- optional: derive the serialize/deserialize format
 
 Review comment:
  Why?




[GitHub] [flink] danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add documentation for DDL introduction

2019-08-16 Thread GitBox
danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add 
documentation for DDL introduction
URL: https://github.com/apache/flink/pull/9366#discussion_r314664109
 
 

 ##
 File path: docs/dev/table/connect.md
 ##
 @@ -652,6 +686,34 @@ connector:
   path: "file:///path/to/whatever"# required: path to a file or directory
 {% endhighlight %}
 
+
+
+{% highlight sql %}
+-- CREATE a partitioned CSV table using the CREATE TABLE syntax.
+create table csv_table (
+  user bigint,
+  message string,
+  ts string
+) 
+COMMENT 'This is a csv table.' 
 
 Review comment:
  Okay




[GitHub] [flink] danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add documentation for DDL introduction

2019-08-16 Thread GitBox
danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add 
documentation for DDL introduction
URL: https://github.com/apache/flink/pull/9366#discussion_r314662011
 
 

 ##
 File path: docs/dev/table/connect.md
 ##
 @@ -276,6 +276,40 @@ tables:
 type: VARCHAR
 {% endhighlight %}
 
+
+
+{% highlight sql %}
+-- CREATE a Kafka 0.10 table that reads from the earliest offset (as a table source)
+-- and uses append mode (as a table sink).
+create table MyUserTable (
 
 Review comment:
  What's the difference?




[GitHub] [flink] danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add documentation for DDL introduction

2019-08-16 Thread GitBox
danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add 
documentation for DDL introduction
URL: https://github.com/apache/flink/pull/9366#discussion_r314669687
 
 

 ##
 File path: docs/dev/table/ddl.md
 ##
 @@ -0,0 +1,119 @@
+---
+title: "DDL"
+nav-parent_id: tableapi
+nav-pos: 0
+---
+
+
+The Table API and SQL are integrated in a joint API. The central concept of
this API is a `Table`, which serves as input and output of queries. This
document describes all the DDL grammar that Flink supports: how to register a
`Table` (or view) through DDL and how to drop a `Table` (or view) through DDL.
+
+* This will be replaced by the TOC
+{:toc}
+
+Create Table
+---
+{% highlight sql %}
+CREATE [OR REPLACE] TABLE [catalog_name.][db_name.]table_name
+  [(col_name1 col_type1 [COMMENT col_comment1], ...)]
+  [COMMENT table_comment]
+  [PARTITIONED BY (col_name1, col_name2, ...)]
+  [WITH (key1=val1, key2=val2, ...)]
+{% endhighlight %}
+
+Create a table with the given table properties. If a table with the same name
already exists in the database, an exception is thrown unless *IF NOT EXISTS*
is declared.
+
+**OR REPLACE**
+
+If this is declared and a table with the same name already exists in the
database, the table is replaced. **Notes:** The OR REPLACE option is not
supported yet; it is currently always false.
+
+**PARTITIONED BY**
+
+Partition the created table by the specified columns. A directory is created 
for each partition if this table is used as a filesystem sink.
+
+**WITH OPTIONS**
+
+Table properties used to create a table source/sink. The properties are
usually used to find and create the underlying connector. **Notes:** the key
and value of the expression `key1=val1` should both be string literals.
+
+See [Connect to External Systems](connect.html) for details on all the
supported table properties of the different connectors.
+
+**Notes:** The table name can take two formats: 1.
`catalog_name.db_name.table_name` 2. `table_name`. For
`catalog_name.db_name.table_name`, the table is registered into the metastore
with the catalog named "catalog_name" and the database named "db_name"; for
`table_name`, the table is registered into the current catalog and database of
the execution table environment.
+
+{% top %}
+
+Drop Table
+---
+{% highlight sql %}
+DROP TABLE [IF EXISTS] [catalog_name.][db_name.]table_name
+{% endhighlight %}
+
+Drop a table with the given table name. If the table to drop does not exist, 
an exception is thrown.
+
+**IF EXISTS**
+
+If the table does not exist, nothing happens.
+
+{% top %}
+
+Create View
 
 Review comment:
   Removed




[GitHub] [flink] danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add documentation for DDL introduction

2019-08-16 Thread GitBox
danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add 
documentation for DDL introduction
URL: https://github.com/apache/flink/pull/9366#discussion_r314667044
 
 

 ##
 File path: docs/dev/table/ddl.md
 ##
 @@ -0,0 +1,119 @@
+---
+title: "DDL"
+nav-parent_id: tableapi
+nav-pos: 0
+---
+
+
+The Table API and SQL are integrated in a joint API. The central concept of
this API is a `Table`, which serves as input and output of queries. This
document describes all the DDL grammar that Flink supports: how to register a
`Table` (or view) through DDL and how to drop a `Table` (or view) through DDL.
+
+* This will be replaced by the TOC
+{:toc}
+
+Create Table
+---
+{% highlight sql %}
+CREATE [OR REPLACE] TABLE [catalog_name.][db_name.]table_name
+  [(col_name1 col_type1 [COMMENT col_comment1], ...)]
+  [COMMENT table_comment]
+  [PARTITIONED BY (col_name1, col_name2, ...)]
+  [WITH (key1=val1, key2=val2, ...)]
+{% endhighlight %}
+
+Create a table with the given table properties. If a table with the same name
already exists in the database, an exception is thrown unless *IF NOT EXISTS*
is declared.
+
+**OR REPLACE**
+
+If this is declared and a table with the same name already exists in the
database, the table is replaced. **Notes:** The OR REPLACE option is not
supported yet; it is currently always false.
+
+**PARTITIONED BY**
+
+Partition the created table by the specified columns. A directory is created 
for each partition if this table is used as a filesystem sink.
+
+**WITH OPTIONS**
+
+Table properties used to create a table source/sink. The properties are
usually used to find and create the underlying connector. **Notes:** the key
and value of the expression `key1=val1` should both be string literals.
+
+See [Connect to External Systems](connect.html) for details on all the
supported table properties of the different connectors.
+
+**Notes:** The table name can take two formats: 1.
`catalog_name.db_name.table_name` 2. `table_name`. For
`catalog_name.db_name.table_name`, the table is registered into the metastore
with the catalog named "catalog_name" and the database named "db_name"; for
`table_name`, the table is registered into the current catalog and database of
the execution table environment.
 
 Review comment:
   We won't support it in the future.




[GitHub] [flink] danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add documentation for DDL introduction

2019-08-07 Thread GitBox
danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add 
documentation for DDL introduction
URL: https://github.com/apache/flink/pull/9366#discussion_r311860029
 
 

 ##
 File path: docs/dev/table/connect.md
 ##
 @@ -656,9 +656,11 @@ connector:
 
 The file system connector itself is included in Flink and does not require an 
additional dependency. A corresponding format needs to be specified for reading 
and writing rows from and to a file system.
 
+The file system connector can also be defined with a *CREATE TABLE DDL*
statement; please see the dedicated [DDL](ddl.html) page for examples.
 
 Review comment:
  Actually we still execute DDLs with Java, Scala, or Python; I suggest moving it to the DDL page and adding a link here.




[GitHub] [flink] danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add documentation for DDL introduction

2019-08-07 Thread GitBox
danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add 
documentation for DDL introduction
URL: https://github.com/apache/flink/pull/9366#discussion_r311852454
 
 

 ##
 File path: docs/dev/table/ddl.md
 ##
 @@ -0,0 +1,230 @@
+---
+title: "DDL"
+nav-parent_id: tableapi
+nav-pos: 0
+---
+
+
+The Table API and SQL are integrated in a joint API. The central concept of
this API is a `Table`, which serves as input and output of queries. This
document describes all the DDL grammar that Flink supports: how to register a
`Table` (or view) through DDL and how to drop a `Table` (or view) through DDL.
+
+* This will be replaced by the TOC
+{:toc}
+
+Create Table
+---
+{% highlight sql %}
+CREATE [OR REPLACE] TABLE [catalog_name.][db_name.]table_name
+  [(col_name1 col_type1 [COMMENT col_comment1], ...)]
+  [COMMENT table_comment]
+  [PARTITIONED BY (col_name1, col_name2, ...)]
+  [WITH (key1=val1, key2=val2, ...)]
+{% endhighlight %}
+
+Create a table with the given table properties. If a table with the same name
already exists in the database, an exception is thrown unless *IF NOT EXISTS*
is declared.
+
+**OR REPLACE**
+
+If this is declared and a table with the same name already exists in the
database, the table is replaced. **Notes:** The OR REPLACE option is not
supported yet; it is currently always false.
+
+**PARTITIONED BY**
+
+Partition the created table by the specified columns. A directory is created 
for each partition.
+
+**WITH OPTIONS**
+
+Table properties used to create a table source/sink. The properties are
usually used to find and create the underlying connector. **Notes:** the key
and value of the expression `key1=val1` should both be string literals.
+
+**Examples**:
+{% highlight sql %}
+-- CREATE a partitioned CSV table using the CREATE TABLE syntax.
+create table csv_table (
+  f0 int,
+  f1 bigint,
+  f2 string
+) 
+COMMENT 'This is a csv table.' 
+PARTITIONED BY(f0)
+WITH (
+  'connector.type' = 'filesystem',
+  'format.type' = 'csv',
+  'connector.path' = 'path1'
 
 Review comment:
   Yep, thanks.




[GitHub] [flink] danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add documentation for DDL introduction

2019-08-07 Thread GitBox
danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add 
documentation for DDL introduction
URL: https://github.com/apache/flink/pull/9366#discussion_r311851845
 
 

 ##
 File path: docs/dev/table/ddl.md
 ##
 @@ -0,0 +1,230 @@
+---
+title: "DDL"
+nav-parent_id: tableapi
+nav-pos: 0
+---
+
+
+The Table API and SQL are integrated in a joint API. The central concept of
this API is a `Table`, which serves as input and output of queries. This
document describes all the DDL grammar that Flink supports: how to register a
`Table` (or view) through DDL and how to drop a `Table` (or view) through DDL.
+
+* This will be replaced by the TOC
+{:toc}
+
+Create Table
+---
+{% highlight sql %}
+CREATE [OR REPLACE] TABLE [catalog_name.][db_name.]table_name
+  [(col_name1 col_type1 [COMMENT col_comment1], ...)]
+  [COMMENT table_comment]
+  [PARTITIONED BY (col_name1, col_name2, ...)]
+  [WITH (key1=val1, key2=val2, ...)]
+{% endhighlight %}
+
+Create a table with the given table properties. If a table with the same name
already exists in the database, an exception is thrown unless *IF NOT EXISTS*
is declared.
+
+**OR REPLACE**
+
+If this is declared and a table with the same name already exists in the
database, the table is replaced. **Notes:** The OR REPLACE option is not
supported yet; it is currently always false.
+
+**PARTITIONED BY**
+
+Partition the created table by the specified columns. A directory is created 
for each partition.
 
 Review comment:
   Updated.




[GitHub] [flink] danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add documentation for DDL introduction

2019-08-07 Thread GitBox
danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add 
documentation for DDL introduction
URL: https://github.com/apache/flink/pull/9366#discussion_r311852423
 
 

 ##
 File path: docs/dev/table/ddl.md
 ##
 @@ -0,0 +1,230 @@
+---
+title: "DDL"
+nav-parent_id: tableapi
+nav-pos: 0
+---
+
+
+The Table API and SQL are integrated in a joint API. The central concept of
this API is a `Table`, which serves as input and output of queries. This
document describes all the DDL grammar that Flink supports: how to register a
`Table` (or view) through DDL and how to drop a `Table` (or view) through DDL.
+
+* This will be replaced by the TOC
+{:toc}
+
+Create Table
+---
+{% highlight sql %}
+CREATE [OR REPLACE] TABLE [catalog_name.][db_name.]table_name
+  [(col_name1 col_type1 [COMMENT col_comment1], ...)]
+  [COMMENT table_comment]
+  [PARTITIONED BY (col_name1, col_name2, ...)]
+  [WITH (key1=val1, key2=val2, ...)]
+{% endhighlight %}
+
+Create a table with the given table properties. If a table with the same name
already exists in the database, an exception is thrown unless *IF NOT EXISTS*
is declared.
+
+**OR REPLACE**
+
+If this is declared and a table with the same name already exists in the
database, the table is replaced. **Notes:** The OR REPLACE option is not
supported yet; it is currently always false.
+
+**PARTITIONED BY**
+
+Partition the created table by the specified columns. A directory is created 
for each partition.
+
+**WITH OPTIONS**
+
+Table properties used to create a table source/sink. The properties are
usually used to find and create the underlying connector. **Notes:** the key
and value of the expression `key1=val1` should both be string literals.
+
+**Examples**:
 
 Review comment:
   I have tried and updated all the DDLs.




[GitHub] [flink] danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add documentation for DDL introduction

2019-08-07 Thread GitBox
danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add 
documentation for DDL introduction
URL: https://github.com/apache/flink/pull/9366#discussion_r311834429
 
 

 ##
 File path: docs/dev/table/ddl.md
 ##
 @@ -0,0 +1,230 @@
+---
+title: "DDL"
+nav-parent_id: tableapi
+nav-pos: 0
+---
+
+
+The Table API and SQL are integrated in a joint API. The central concept of
this API is a `Table`, which serves as input and output of queries. This
document describes all the DDL grammar that Flink supports: how to register a
`Table` (or view) through DDL and how to drop a `Table` (or view) through DDL.
+
+* This will be replaced by the TOC
+{:toc}
+
+Create Table
+---
+{% highlight sql %}
+CREATE [OR REPLACE] TABLE [catalog_name.][db_name.]table_name
+  [(col_name1 col_type1 [COMMENT col_comment1], ...)]
 
 Review comment:
  We actually support omitting column definitions now, just like Spark and Hive; it would produce a table with an empty row type. I have referenced the Data Types page at the bottom.
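
A sketch of the omitted-column-definitions form (connector properties are illustrative); the resulting table has an empty row type:

{% highlight sql %}
CREATE TABLE t WITH (
  'connector.type' = 'filesystem',
  'connector.path' = 'file:///tmp/t',
  'format.type' = 'csv'
);
{% endhighlight %}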




[GitHub] [flink] danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add documentation for DDL introduction

2019-08-07 Thread GitBox
danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add 
documentation for DDL introduction
URL: https://github.com/apache/flink/pull/9366#discussion_r311835677
 
 

 ##
 File path: docs/dev/table/ddl.md
 ##
 @@ -0,0 +1,230 @@
+---
+title: "DDL"
 
 Review comment:
  It is a dedicated page which belongs under the `SQL` page, just like the `Data Types` page; I have added the link for it.
  I put it on a single page because the `DDL` grammar is very different from the query grammar, and we need to give a description of almost every grammar block (optional or required).




[GitHub] [flink] danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add documentation for DDL introduction

2019-08-07 Thread GitBox
danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add 
documentation for DDL introduction
URL: https://github.com/apache/flink/pull/9366#discussion_r311852353
 
 

 ##
 File path: docs/dev/table/ddl.md
 ##
 @@ -0,0 +1,230 @@
+---
+title: "DDL"
+nav-parent_id: tableapi
+nav-pos: 0
+---
+
+
+The Table API and SQL are integrated in a joint API. The central concept of
this API is a `Table`, which serves as input and output of queries. This
document describes all the DDL grammar that Flink supports: how to register a
`Table` (or view) through DDL and how to drop a `Table` (or view) through DDL.
+
+* This will be replaced by the TOC
+{:toc}
+
+Create Table
+---
+{% highlight sql %}
+CREATE [OR REPLACE] TABLE [catalog_name.][db_name.]table_name
+  [(col_name1 col_type1 [COMMENT col_comment1], ...)]
+  [COMMENT table_comment]
+  [PARTITIONED BY (col_name1, col_name2, ...)]
+  [WITH (key1=val1, key2=val2, ...)]
+{% endhighlight %}
+
+Create a table with the given table properties. If a table with the same name
already exists in the database, an exception is thrown unless *IF NOT EXISTS*
is declared.
+
+**OR REPLACE**
+
+If this is declared and a table with the same name already exists in the
database, the table is replaced. **Notes:** The OR REPLACE option is not
supported yet; it is currently always false.
+
+**PARTITIONED BY**
+
+Partition the created table by the specified columns. A directory is created 
for each partition.
+
+**WITH OPTIONS**
+
+Table properties used to create a table source/sink. The properties are
usually used to find and create the underlying connector. **Notes:** the key
and value of the expression `key1=val1` should both be string literals.
 
 Review comment:
  Agreed, thanks; I have added the properties link.




[GitHub] [flink] danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add documentation for DDL introduction

2019-08-07 Thread GitBox
danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add 
documentation for DDL introduction
URL: https://github.com/apache/flink/pull/9366#discussion_r311835114
 
 

 ##
 File path: docs/dev/table/ddl.md
 ##
 @@ -0,0 +1,230 @@
+---
+title: "DDL"
+nav-parent_id: tableapi
+nav-pos: 0
+---
+
+
+The Table API and SQL are integrated in a joint API. The central concept of
this API is a `Table`, which serves as input and output of queries. This
document describes all the DDL grammar that Flink supports: how to register a
`Table` (or view) through DDL and how to drop a `Table` (or view) through DDL.
+
+* This will be replaced by the TOC
+{:toc}
+
+Create Table
+---
+{% highlight sql %}
+CREATE [OR REPLACE] TABLE [catalog_name.][db_name.]table_name
+  [(col_name1 col_type1 [COMMENT col_comment1], ...)]
+  [COMMENT table_comment]
+  [PARTITIONED BY (col_name1, col_name2, ...)]
+  [WITH (key1=val1, key2=val2, ...)]
+{% endhighlight %}
+
+Create a table with the given table properties. If a table with the same name
already exists in the database, an exception is thrown unless *IF NOT EXISTS*
is declared.
+
+**OR REPLACE**
+
+If this is declared and a table with the same name already exists in the
database, the table is replaced. **Notes:** The OR REPLACE option is not
supported yet; it is currently always false.
+
+**PARTITIONED BY**
+
+Partition the created table by the specified columns. A directory is created 
for each partition.
+
+**WITH OPTIONS**
+
+Table properties used to create a table source/sink. The properties are
usually used to find and create the underlying connector. **Notes:** the key
and value of the expression `key1=val1` should both be string literals.
+
+**Examples**:
+{% highlight sql %}
+-- CREATE a partitioned CSV table using the CREATE TABLE syntax.
+create table csv_table (
+  f0 int,
+  f1 bigint,
+  f2 string
+) 
+COMMENT 'This is a csv table.' 
+PARTITIONED BY(f0)
+WITH (
+  'connector.type' = 'filesystem',
+  'format.type' = 'csv',
+  'connector.path' = 'path1',
+  'format.fields.0.name' = 'f0',
+  'format.fields.0.type' = 'INT',
+  'format.fields.1.name' = 'f1',
+  'format.fields.1.type' = 'BIGINT',
+  'format.fields.2.name' = 'f2',
+  'format.fields.2.type' = 'STRING'
+);
+
+-- CREATE a Kafka table that reads from the earliest offset (as a table source)
+-- and uses append mode (as a table sink).
+create table kafka_table (
+  f0 int,
+  f1 bigint,
+  f2 string
+) with (
+  'connector.type' = 'kafka',
+  'update-mode' = 'append',
+  'connector.topic' = 'topic_name',
+  'connector.startup-mode' = 'earliest-offset',
+  'connector.properties.0.key' = 'props-key0',
+  'connector.properties.0.value' = 'props-val0',
+  'format.fields.0.name' = 'f0',
+  'format.fields.0.type' = 'INT',
+  'format.fields.1.name' = 'f1',
+  'format.fields.1.type' = 'BIGINT',
+  'format.fields.2.name' = 'f2',
+  'format.fields.2.type' = 'STRING'
+);
+
+-- CREATE an Elasticsearch table.
+create table es_table (
+  f0 int,
+  f1 bigint,
+  f2 string
+) with (
+  'connector.type' = 'elasticsearch',
+  'update-mode' = 'append',
+  'connector.hosts.0.hostname' = 'host_name',
+  'connector.hosts.0.port' = '9092',
+  'connector.hosts.0.protocol' = 'http',
+  'connector.index' = 'index_name',
+  'connector.document-type' = 'type_name',
+  'format.fields.0.name' = 'f0',
+  'format.fields.0.type' = 'INT',
+  'format.fields.1.name' = 'f1',
+  'format.fields.1.type' = 'BIGINT',
+  'format.fields.2.name' = 'f2',
+  'format.fields.2.type' = 'STRING'
+);
+{% endhighlight %}
+
+{% top %}
+
+Drop Table
+---
+{% highlight sql %}
+DROP TABLE [IF EXISTS] [catalog_name.][db_name.]table_name
+{% endhighlight %}
+
+Drop a table with the given table name. If the table to drop does not exist, 
an exception is thrown.
+
+**IF EXISTS**
+
+If the table does not exist, nothing happens.
+
+{% top %}
+
+Create View
+---
+{% highlight sql %}
+CREATE [OR REPLACE] VIEW [catalog_name.][db_name.]view_name
+[COMMENT view_comment]
+AS
+select_statement
+{% endhighlight %}
+
+Define a logical view over a SQL query, which may select from multiple tables
or views.
+
+**OR REPLACE**
+
+If the view does not exist, CREATE OR REPLACE VIEW is equivalent to CREATE
VIEW. If the view does exist, CREATE OR REPLACE VIEW is equivalent to ALTER
VIEW. **Notes:** The OR REPLACE option is not supported yet; it is currently
always false.
+
+**AS select_statement**
+
+A SELECT statement that defines the view. The statement can select from base
tables or other views.
+
+**Examples**:
+{% highlight sql %}
+-- Create a view named view1 in catalog1.database1. The view definition is
recorded in the specified catalog and database.
+CREATE VIEW catalog1.database1.view1
+  AS SELECT * FROM company JOIN dept ON company.dept_id = dept.id;
+
+-- Create or replace a view from a persistent view with an extra filter
+CREATE OR REPLACE VIEW view2
+  AS SELECT * FROM catalog1.database1.view1 WHERE loc = 'Shanghai';
+
+-- Access the base 

[GitHub] [flink] danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add documentation for DDL introduction

2019-08-07 Thread GitBox
danny0405 commented on a change in pull request #9366: [FLINK-13359][docs] Add 
documentation for DDL introduction
URL: https://github.com/apache/flink/pull/9366#discussion_r311834574
 
 

 ##
 File path: docs/dev/table/ddl.md
 ##
 @@ -0,0 +1,230 @@
+---
+title: "DDL"
+nav-parent_id: tableapi
+nav-pos: 0
+---
+
+
+The Table API and SQL are integrated in a joint API. The central concept of
this API is a `Table`, which serves as input and output of queries. This
document describes all the DDL grammar that Flink supports: how to register a
`Table` (or view) through DDL and how to drop a `Table` (or view) through DDL.
+
+* This will be replaced by the TOC
+{:toc}
+
+Create Table
+---
+{% highlight sql %}
+CREATE [OR REPLACE] TABLE [catalog_name.][db_name.]table_name
+  [(col_name1 col_type1 [COMMENT col_comment1], ...)]
+  [COMMENT table_comment]
+  [PARTITIONED BY (col_name1, col_name2, ...)]
+  [WITH (key1=val1, key2=val2, ...)]
+{% endhighlight %}
+
+Create a table with the given table properties. If a table with the same name
already exists in the database, an exception is thrown unless *IF NOT EXISTS*
is declared.
+
+**OR REPLACE**
 
 Review comment:
  I suggest listing the grammar we support and giving the right notes.

