This is an automated email from the ASF dual-hosted git repository.

wanghailin pushed a commit to branch dev
in repository https://gitbox.apache.org/repos/asf/seatunnel.git

The following commit(s) were added to refs/heads/dev by this push:
new f1601e3ea6 [Docs] fix kafka format typo (#6633)
f1601e3ea6 is described below
commit f1601e3ea662135c9aae86ace6a44aef35d5e027
Author: Jarvis <[email protected]>
AuthorDate: Sun Apr 7 19:26:59 2024 +0800
[Docs] fix kafka format typo (#6633)
---
docs/en/connector-v2/sink/Kafka.md | 26 +++++++++++++-------------
docs/en/connector-v2/source/kafka.md | 34 +++++++++++++++++-----------------
2 files changed, 30 insertions(+), 30 deletions(-)
diff --git a/docs/en/connector-v2/sink/Kafka.md b/docs/en/connector-v2/sink/Kafka.md
index c28dd6a08e..2919eab988 100644
--- a/docs/en/connector-v2/sink/Kafka.md
+++ b/docs/en/connector-v2/sink/Kafka.md
@@ -30,19 +30,19 @@ They can be downloaded via install-plugin.sh or from the Maven central repositor
## Sink Options
-| Name | Type | Required | Default | Description |
-|----------------------|--------|----------|---------|-------------|
-| topic | String | Yes | - | When the table is used as sink, the topic name is the topic to write data to. |
-| bootstrap.servers | String | Yes | - | Comma separated list of Kafka brokers. |
-| kafka.config | Map | No | - | In addition to the above parameters that must be specified by the `Kafka producer` client, the user can also specify multiple non-mandatory parameters for the `producer` client, covering [all the producer parameters specified in the official Kafka document](https://kafka.apache.org/documentation.html#producerconfigs). |
-| semantics | String | No | NON | Semantics that can be chosen EXACTLY_ONCE/AT_LEAST_ONCE/NON, default NON. |
-| partition_key_fields | Array | No | - | Configure which fields are used as the key of the kafka message. |
-| partition | Int | No | - | We can specify the partition, all messages will be sent to this partition. |
-| assign_partitions | Array | No | - | We can decide which partition to send based on the content of the message. The function of this parameter is to distribute information. |
-| transaction_prefix | String | No | - | If semantic is specified as EXACTLY_ONCE, the producer will write all messages in a Kafka transaction,kafka distinguishes different transactions by different transactionId. This parameter is prefix of kafka transactionId, make sure different job use different prefix. |
-| format | String | No | json | Data format. The default format is json. Optional text format, canal-json, debezium-json and avro.If you use json or text format. The default field separator is ", ". If you customize the delimiter, add the "field_delimiter" option.If you use canal format, please refer to [canal-json](../formats/canal-json.md) for details.If you use debezium format, please refer to [debezium-json](../formats/debezium-json.md) for details. |
-| field_delimiter | String | No | , | Customize the field delimiter for data format. |
-| common-options | | No | - | Source plugin common parameters, please refer to [Source Common Options](common-options.md) for details |
+| Name | Type | Required | Default | Description |
+|----------------------|--------|----------|---------|-------------|
+| topic | String | Yes | - | When the table is used as sink, the topic name is the topic to write data to. |
+| bootstrap.servers | String | Yes | - | Comma separated list of Kafka brokers. |
+| kafka.config | Map | No | - | In addition to the above parameters that must be specified by the `Kafka producer` client, the user can also specify multiple non-mandatory parameters for the `producer` client, covering [all the producer parameters specified in the official Kafka document](https://kafka.apache.org/documentation.html#producerconfigs). |
+| semantics | String | No | NON | Semantics that can be chosen EXACTLY_ONCE/AT_LEAST_ONCE/NON, default NON. |
+| partition_key_fields | Array | No | - | Configure which fields are used as the key of the kafka message. |
+| partition | Int | No | - | We can specify the partition, all messages will be sent to this partition. |
+| assign_partitions | Array | No | - | We can decide which partition to send based on the content of the message. The function of this parameter is to distribute information. |
+| transaction_prefix | String | No | - | If semantic is specified as EXACTLY_ONCE, the producer will write all messages in a Kafka transaction,kafka distinguishes different transactions by different transactionId. This parameter is prefix of kafka transactionId, make sure different job use different prefix. |
+| format | String | No | json | Data format. The default format is json. Optional text format, canal_json, debezium_json, ogg_json and avro.If you use json or text format. The default field separator is ", ". If you customize the delimiter, add the "field_delimiter" option.If you use canal format, please refer to [canal-json](../formats/canal-json.md) for details.If you use debezium format, please refer to [debezium-json](../formats/debezium-json.md) for details. |
+| field_delimiter | String | No | , | Customize the field delimiter for data format. |
+| common-options | | No | - | Source plugin common parameters, please refer to [Source Common Options](common-options.md) for details |
## Parameter Interpretation
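For context, the corrected `format` values above would appear in a SeaTunnel job file like the illustrative sketch below. This is not part of the commit; the topic name and broker address are placeholders.

```hocon
# Hypothetical sink block (placeholder topic/servers), using the
# renamed format value `canal_json` (previously documented as "canal-json").
sink {
  Kafka {
    topic             = "test_topic"
    bootstrap.servers = "localhost:9092"
    format            = "canal_json"
    semantics         = "AT_LEAST_ONCE"
  }
}
```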
diff --git a/docs/en/connector-v2/source/kafka.md b/docs/en/connector-v2/source/kafka.md
index ebee2bb3d5..982c62e5fb 100644
--- a/docs/en/connector-v2/source/kafka.md
+++ b/docs/en/connector-v2/source/kafka.md
@@ -32,23 +32,23 @@ They can be downloaded via install-plugin.sh or from the Maven central repositor
## Source Options
-| Name | Type | Required | Default | Description |
-|-------------------------------------|------|----------|--------------------------|-------------|
-| topic | String | Yes | - | Topic name(s) to read data from when the table is used as source. It also supports topic list for source by separating topic by comma like 'topic-1,topic-2'. |
-| bootstrap.servers | String | Yes | - | Comma separated list of Kafka brokers. |
-| pattern | Boolean | No | false | If `pattern` is set to `true`,the regular expression for a pattern of topic names to read from. All topics in clients with names that match the specified regular expression will be subscribed by the consumer. |
-| consumer.group | String | No | SeaTunnel-Consumer-Group | `Kafka consumer group id`, used to distinguish different consumer groups. |
-| commit_on_checkpoint | Boolean | No | true | If true the consumer's offset will be periodically committed in the background. |
-| kafka.config | Map | No | - | In addition to the above necessary parameters that must be specified by the `Kafka consumer` client, users can also specify multiple `consumer` client non-mandatory parameters, covering [all consumer parameters specified in the official Kafka document](https://kafka.apache.org/documentation.html#consumerconfigs). |
-| schema | Config | No | - | The structure of the data, including field names and field types. |
-| format | String | No | json | Data format. The default format is json. Optional text format, canal-json, debezium-json and avro.If you use json or text format. The default field separator is ", ". If you customize the delimiter, add the "field_delimiter" option.If you use canal format, please refer to [canal-json](../formats/canal-json.md) for details.If you use d [...]
-| format_error_handle_way | String | No | fail | The processing method of data format error. The default value is fail, and the optional value is (fail, skip). When fail is selected, data format error will block and an exception will be thrown. When skip is selected, data format error will skip this line data. |
-| field_delimiter | String | No | , | Customize the field delimiter for data format. |
-| start_mode | StartMode[earliest],[group_offsets],[latest],[specific_offsets],[timestamp] | No | group_offsets | The initial consumption pattern of consumers. |
-| start_mode.offsets | Config | No | - | The offset required for consumption mode to be specific_offsets. |
-| start_mode.timestamp | Long | No | - | The time required for consumption mode to be "timestamp". |
-| partition-discovery.interval-millis | Long | No | -1 | The interval for dynamically discovering topics and partitions. |
-| common-options | | No | - | Source plugin common parameters, please refer to [Source Common Options](common-options.md) for details |
+| Name | Type | Required | Default | Description |
+|-------------------------------------|------|----------|--------------------------|-------------|
+| topic | String | Yes | - | Topic name(s) to read data from when the table is used as source. It also supports topic list for source by separating topic by comma like 'topic-1,topic-2'. |
+| bootstrap.servers | String | Yes | - | Comma separated list of Kafka brokers. |
+| pattern | Boolean | No | false | If `pattern` is set to `true`,the regular expression for a pattern of topic names to read from. All topics in clients with names that match the specified regular expression will be subscribed by the consumer. |
+| consumer.group | String | No | SeaTunnel-Consumer-Group | `Kafka consumer group id`, used to distinguish different consumer groups. |
+| commit_on_checkpoint | Boolean | No | true | If true the consumer's offset will be periodically committed in the background. |
+| kafka.config | Map | No | - | In addition to the above necessary parameters that must be specified by the `Kafka consumer` client, users can also specify multiple `consumer` client non-mandatory parameters, covering [all consumer parameters specified in the official Kafka document](https://kafka.apache.org/documentation.html#consumerconfigs). |
+| schema | Config | No | - | The structure of the data, including field names and field types. |
+| format | String | No | json | Data format. The default format is json. Optional text format, canal_json, debezium_json, ogg_json and avro.If you use json or text format. The default field separator is ", ". If you customize the delimiter, add the "field_delimiter" option.If you use canal format, please refer to [canal-json](../formats/canal-json.md) for details.If [...]
+| format_error_handle_way | String | No | fail | The processing method of data format error. The default value is fail, and the optional value is (fail, skip). When fail is selected, data format error will block and an exception will be thrown. When skip is selected, data format error will skip this line data. |
+| field_delimiter | String | No | , | Customize the field delimiter for data format. |
+| start_mode | StartMode[earliest],[group_offsets],[latest],[specific_offsets],[timestamp] | No | group_offsets | The initial consumption pattern of consumers. |
+| start_mode.offsets | Config | No | - | The offset required for consumption mode to be specific_offsets. |
+| start_mode.timestamp | Long | No | - | The time required for consumption mode to be "timestamp". |
+| partition-discovery.interval-millis | Long | No | -1 | The interval for dynamically discovering topics and partitions. |
+| common-options | | No | - | Source plugin common parameters, please refer to [Source Common Options](common-options.md) for details |
## Task Example
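Likewise for the source side, the corrected format spellings would be used in a job file roughly as sketched below. This sketch is not from the commit; topic, brokers, and group id are placeholder values, and only a subset of the options in the table above is shown.

```hocon
# Hypothetical source block (placeholder topic/servers), using the
# corrected format value `ogg_json` now listed in the source docs.
source {
  Kafka {
    topic             = "test_topic"
    bootstrap.servers = "localhost:9092"
    consumer.group    = "SeaTunnel-Consumer-Group"
    format            = "ogg_json"
    start_mode        = "group_offsets"
  }
}
```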
