Repository: flink
Updated Branches:
  refs/heads/release-1.3 7095be5d7 -> 0f4f2bd45


[hotfix] [docs] Extend TableSink documentation.


Project: http://git-wip-us.apache.org/repos/asf/flink/repo
Commit: http://git-wip-us.apache.org/repos/asf/flink/commit/0f4f2bd4
Tree: http://git-wip-us.apache.org/repos/asf/flink/tree/0f4f2bd4
Diff: http://git-wip-us.apache.org/repos/asf/flink/diff/0f4f2bd4

Branch: refs/heads/release-1.3
Commit: 0f4f2bd45ed8c048bcf7b0338d44c2aafaac5a8a
Parents: 7095be5
Author: Fabian Hueske <fhue...@apache.org>
Authored: Fri Aug 11 17:33:57 2017 +0200
Committer: Fabian Hueske <fhue...@apache.org>
Committed: Fri Aug 11 17:33:57 2017 +0200

----------------------------------------------------------------------
 docs/dev/table/sourceSinks.md | 54 +++++++++++++++++++++++++++++++++++---
 1 file changed, 51 insertions(+), 3 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/flink/blob/0f4f2bd4/docs/dev/table/sourceSinks.md
----------------------------------------------------------------------
diff --git a/docs/dev/table/sourceSinks.md b/docs/dev/table/sourceSinks.md
index 7af74ca..0431add 100644
--- a/docs/dev/table/sourceSinks.md
+++ b/docs/dev/table/sourceSinks.md
@@ -34,8 +34,6 @@ Have a look at the [common concepts and API](common.html) page for details how t
 Provided TableSources
 ---------------------
 
-**TODO: extend and complete**
-
 Currently, Flink provides the `CsvTableSource` to read CSV files and a few table sources to read JSON or Avro data from Kafka.
 A custom `TableSource` can be defined by implementing the `BatchTableSource` or `StreamTableSource` interface. See the section on [defining a custom TableSource](#define-a-tablesource) for details.
 
@@ -202,7 +200,57 @@ val csvTableSource = CsvTableSource
 Provided TableSinks
 -------------------
 
-**TODO**
+The following table lists the `TableSink`s that are provided with Flink.
+
+| **Class name** | **Maven dependency** | **Batch?** | **Streaming?** | **Description** |
+| :------------- | :------------------- | :--------- | :------------- | :-------------- |
+| `CsvTableSink` | `flink-table` | Y | Append | A simple sink for CSV files. |
+| `JDBCAppendTableSink` | `flink-jdbc` | Y | Append | Writes tables to a JDBC database. |
+| `Kafka08JsonTableSink` | `flink-connector-kafka-0.8` | N | Append | A Kafka 0.8 sink with JSON encoding. |
+| `Kafka09JsonTableSink` | `flink-connector-kafka-0.9` | N | Append | A Kafka 0.9 sink with JSON encoding. |
+
+All sinks that come with the `flink-table` dependency can be used directly in your Table programs. For all other table sinks, you have to add the respective dependency in addition to the `flink-table` dependency.
+
+A custom `TableSink` can be defined by implementing the `BatchTableSink`, `AppendStreamTableSink`, `RetractStreamTableSink`, or `UpsertStreamTableSink` interface. See the section on [defining a custom TableSink](#define-a-tablesink) for details.
+
+{% top %}
+
+### CsvTableSink
+
+The `CsvTableSink` emits a `Table` to one or more CSV files. 
+
+The sink only supports append-only streaming tables. It cannot be used to emit a `Table` that is continuously updated. See the [documentation on Table to Stream conversions](./streaming.html#table-to-stream-conversion) for details. When emitting a streaming table, rows are written at least once (if checkpointing is enabled) and the `CsvTableSink` does not split output files into bucket files but continuously writes to the same files.
+
+<div class="codetabs" markdown="1">
+<div data-lang="java" markdown="1">
+{% highlight java %}
+
+Table table = ...
+
+table.writeToSink(
+  new CsvTableSink(
+    path,                  // output path
+    "|",                   // optional: delimit fields by '|' instead of the default ','
+    1,                     // optional: write to a single file
+    WriteMode.OVERWRITE)); // optional: overwrite existing files
+
+{% endhighlight %}
+</div>
+
+<div data-lang="scala" markdown="1">
+{% highlight scala %}
+
+val table: Table = ???
+
+table.writeToSink(
+  new CsvTableSink(
+    path,                             // output path
+    fieldDelim = "|",                 // optional: delimit fields by '|' instead of the default ','
+    numFiles = 1,                     // optional: write to a single file
+    writeMode = WriteMode.OVERWRITE)) // optional: overwrite existing files
+
+{% endhighlight %}
+</div>
+</div>
 
 {% top %}
 
