hailin0 commented on code in PR #8724: URL: https://github.com/apache/seatunnel/pull/8724#discussion_r1961236747
##########
docs/en/connector-v2/sink/Kafka.md:
##########
@@ -30,21 +30,21 @@ They can be downloaded via install-plugin.sh or from the Maven central repositor
## Sink Options
-| Name                  | Type   | Required | Default | Description |
-|-----------------------|--------|----------|---------|-------------|
-| topic                 | String | Yes      | -       | When the table is used as sink, the topic name is the topic to write data to. |
-| bootstrap.servers     | String | Yes      | -       | Comma separated list of Kafka brokers. |
-| kafka.config          | Map    | No       | -       | In addition to the above parameters that must be specified by the `Kafka producer` client, the user can also specify multiple non-mandatory parameters for the `producer` client, covering [all the producer parameters specified in the official Kafka document](https://kafka.apache.org/documentation.html#producerconfigs). |
-| semantics             | String | No       | NON     | Semantics that can be chosen EXACTLY_ONCE/AT_LEAST_ONCE/NON, default NON. |
-| partition_key_fields  | Array  | No       | -       | Configure which fields are used as the key of the kafka message. |
-| partition             | Int    | No       | -       | We can specify the partition, all messages will be sent to this partition. |
-| assign_partitions     | Array  | No       | -       | We can decide which partition to send based on the content of the message. The function of this parameter is to distribute information. |
-| transaction_prefix    | String | No       | -       | If semantic is specified as EXACTLY_ONCE, the producer will write all messages in a Kafka transaction,kafka distinguishes different transactions by different transactionId. This parameter is prefix of kafka transactionId, make sure different job use different prefix. |
-| format                | String | No       | json    | Data format. The default format is json. Optional text format, canal_json, debezium_json, ogg_json and avro.If you use json or text format. The default field separator is ", ". If you customize the delimiter, add the "field_delimiter" option.If you use canal format, please refer to [canal-json](../formats/canal-json.md) for details.If you use debezium format, please refer to [debezium-json](../formats/debezium-json.md) for details. |
-| field_delimiter       | String | No       | ,       | Customize the field delimiter for data format. |
-| common-options        |        | No       | -       | Source plugin common parameters, please refer to [Source Common Options](../sink-common-options.md) for details |
-| protobuf_message_name | String | No       | -       | Effective when the format is set to protobuf, specifies the Message name |
-| protobuf_schema       | String | No       | -       | Effective when the format is set to protobuf, specifies the Schema definition |
+| Name                  | Type   | Required | Default | Description |
+|-----------------------|--------|----------|---------|-------------|
+| topic                 | String | Yes      | -       | When the table is used as sink, the topic name is the topic to write data to. |
+| bootstrap.servers     | String | Yes      | -       | Comma separated list of Kafka brokers. |
+| kafka.config          | Map    | No       | -       | In addition to the above parameters that must be specified by the `Kafka producer` client, the user can also specify multiple non-mandatory parameters for the `producer` client, covering [all the producer parameters specified in the official Kafka document](https://kafka.apache.org/documentation.html#producerconfigs). |
+| semantics             | String | No       | NON     | Semantics that can be chosen EXACTLY_ONCE/AT_LEAST_ONCE/NON, default NON. |
+| partition_key_fields  | Array  | No       | -       | Configure which fields are used as the key of the kafka message. |
+| partition             | Int    | No       | -       | We can specify the partition, all messages will be sent to this partition. |
+| assign_partitions     | Array  | No       | -       | We can decide which partition to send based on the content of the message. The function of this parameter is to distribute information. |
+| transaction_prefix    | String | No       | -       | If semantic is specified as EXACTLY_ONCE, the producer will write all messages in a Kafka transaction,kafka distinguishes different transactions by different transactionId. This parameter is prefix of kafka transactionId, make sure different job use different prefix. |
+| format                | String | No       | json    | Data format. The default format is json. Optional text format, canal_json, debezium_json, ogg_json, avro and native.If you use json or text format. The default field separator is ", ". If you customize the delimiter, add the "field_delimiter" option.If you use canal format, please refer to [canal-json](../formats/canal-json.md) for details.If you use debezium format, please refer to [debezium-json](../formats/debezium-json.md) for details. |
Review Comment:
Please add a config example for the `native` format, and describe the input/output row fields it expects.
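
A minimal sketch of what such an example could look like, using only option names from the table above; the topic name and broker address are placeholder values, and the extra `kafka.config` entries are illustrative producer settings rather than requirements of the `native` format:

```hocon
sink {
  Kafka {
    # Placeholder topic and broker list -- replace with real values.
    topic = "test_topic"
    bootstrap.servers = "localhost:9092"

    # Write records using the native format added by this PR.
    format = native

    # Optional pass-through producer settings (any official Kafka
    # producer config key is accepted here, per the kafka.config option).
    kafka.config = {
      acks = "all"
      request.timeout.ms = 60000
    }
  }
}
```

The description of the `native` format in the table should also spell out which row fields are read and written, as requested above.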
