This is an automated email from the ASF dual-hosted git repository.
tyrantlucifer pushed a commit to branch dev
in repository https://gitbox.apache.org/repos/asf/incubator-seatunnel.git
The following commit(s) were added to refs/heads/dev by this push:
new da7490e4a [Docs][Connector-V1][Kafka] Update the docs of kafka connector (#3297)
da7490e4a is described below
commit da7490e4a523f52c2599a38f00db90c3cac8dae7
Author: Carl-Zhou-CN <[email protected]>
AuthorDate: Sat Nov 5 11:43:24 2022 +0800
[Docs][Connector-V1][Kafka] Update the docs of kafka connector (#3297)
* [Bug] [seatunnel-connector-V1-flink-kafka] The parameter matches the document
* [Bug] [seatunnel-connector-V1-flink-kafka] The parameter matches the document
* [Bug] [seatunnel-connector-V1-flink-kafka] The parameter matches the document
Co-authored-by: zhouyao <[email protected]>
---
docs/en/connector/sink/{Kafka.md => Kafka.mdx} | 87 +++++++++++++++++++++++---
1 file changed, 78 insertions(+), 9 deletions(-)
diff --git a/docs/en/connector/sink/Kafka.md b/docs/en/connector/sink/Kafka.mdx
similarity index 57%
rename from docs/en/connector/sink/Kafka.md
rename to docs/en/connector/sink/Kafka.mdx
index 0e225cf3d..350fe05d5 100644
--- a/docs/en/connector/sink/Kafka.md
+++ b/docs/en/connector/sink/Kafka.mdx
@@ -1,3 +1,6 @@
+import Tabs from '@theme/Tabs';
+import TabItem from '@theme/TabItem';
+
# Kafka
> Kafka sink connector
@@ -17,48 +20,114 @@ Engine Supported and plugin name
## Options
+<Tabs
+ groupId="engine-type"
+ defaultValue="spark"
+ values={[
+ {label: 'Spark', value: 'spark'},
+ {label: 'Flink', value: 'flink'},
+ ]}>
+<TabItem value="spark">
+
| name | type | required | default value |
| -------------------------- | ------ | -------- | ------------- |
| producer.bootstrap.servers | string | yes | - |
| topic | string | yes | - |
| producer.* | string | no | - |
+| format | string | no | json |
+| common-options | string | no | - |
+
+</TabItem>
+<TabItem value="flink">
+
+| name | type | required | default value |
+| -------------------------- | ------ | -------- | ------------- |
+| producer.bootstrap.servers | string | yes | - |
+| topics | string | yes | - |
+| producer.* | string | no | - |
| semantic | string | no | - |
| common-options | string | no | - |
-### producer.bootstrap.servers [string]
+</TabItem>
+</Tabs>
+
+### producer.bootstrap.servers [string]
Kafka Brokers List
+### producer [string]
+
+In addition to the above parameters that must be specified by the `Kafka producer` client, the user can also specify multiple non-mandatory parameters for the `producer` client, covering [all the producer parameters specified in the official Kafka document](https://kafka.apache.org/documentation.html#producerconfigs).
+
+The way to specify the parameter is to add the prefix `producer.` to the original parameter name. For example, the way to specify `request.timeout.ms` is: `producer.request.timeout.ms = 60000`. If these non-essential parameters are not specified, they will use the default values given in the official Kafka documentation.
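+
+As a sketch of the prefix mechanism, the sink block below (option values are illustrative, not recommendations) passes `request.timeout.ms` and `acks` through to the underlying Kafka producer:
+
+```bash
+kafka {
+    # on the Flink engine the option is `topics` instead of `topic`
+    topic = "seatunnel"
+    producer.bootstrap.servers = "localhost:9092"
+    # forwarded to the Kafka producer as request.timeout.ms
+    producer.request.timeout.ms = 60000
+    # forwarded to the Kafka producer as acks
+    producer.acks = "all"
+}
+```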
+
+<Tabs
+ groupId="engine-type"
+ defaultValue="spark"
+ values={[
+ {label: 'Spark', value: 'spark'},
+ {label: 'Flink', value: 'flink'},
+ ]}>
+<TabItem value="spark">
+
### topic [string]
Kafka Topic
-### producer [string]
+### format [string]
-In addition to the above parameters that must be specified by the `Kafka producer` client, the user can also specify multiple non-mandatory parameters for the `producer` client, covering [all the producer parameters specified in the official Kafka document](https://kafka.apache.org/documentation.html#producerconfigs).
+The data format written to Kafka: `text` or `json`
-The way to specify the parameter is to add the prefix `producer.` to the original parameter name. For example, the way to specify `request.timeout.ms` is: `producer.request.timeout.ms = 60000`. If these non-essential parameters are not specified, they will use the default values given in the official Kafka documentation.
+</TabItem>
+<TabItem value="flink">
+
+### topics [string]
+
+Kafka Topic. Note: only one topic is supported for now.
### semantic [string]
-Semantics that can be chosen. exactly_once/at_least_once/none, default is at_least_once
+Semantics that can be chosen: `exactly_once`, `at_least_once`, or `none`; the default is `at_least_once`.
In exactly_once, flink producer will write all messages in a Kafka transaction that will be committed to Kafka on a checkpoint.
-
In at_least_once, flink producer will wait for all outstanding messages in the Kafka buffers to be acknowledged by the Kafka producer on a checkpoint.
-
NONE does not provide any guarantees: messages may be lost in case of issues on the Kafka broker and messages may be duplicated in case of a Flink failure.
-
please refer to [Flink Kafka Fault Tolerance](https://nightlies.apache.org/flink/flink-docs-release-1.14/docs/connectors/datastream/kafka/#fault-tolerance)
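+
+For illustration, a Flink-engine sink block requesting exactly-once delivery might look like this (values are examples; `exactly_once` additionally relies on Flink checkpointing and Kafka transaction support):
+
+```bash
+kafka {
+    topics = "seatunnel"
+    producer.bootstrap.servers = "localhost:9092"
+    semantic = "exactly_once"
+}
+```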
+</TabItem>
+</Tabs>
+
+
### common options [string]
-Sink plugin common parameters, please refer to [Sink Plugin](common-options.md) for details
+Sink plugin common parameters, please refer to [Sink Plugin](common-options.mdx) for details
## Examples
+<Tabs
+ groupId="engine-type"
+ defaultValue="spark"
+ values={[
+ {label: 'Spark', value: 'spark'},
+ {label: 'Flink', value: 'flink'},
+ ]}>
+<TabItem value="spark">
+
+```bash
+kafka {
+ topic = "seatunnel"
+ producer.bootstrap.servers = "localhost:9092"
+}
+```
+
+</TabItem>
+<TabItem value="flink">
+
```bash
kafka {
topics = "seatunnel"
producer.bootstrap.servers = "localhost:9092"
}
```
+
+</TabItem>
+</Tabs>
\ No newline at end of file