This is an automated email from the ASF dual-hosted git repository.

diwu pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/doris-website.git


The following commit(s) were added to refs/heads/master by this push:
     new a2dfa8392db [doc](ecosystem) update kafka connector doc (#2573)
a2dfa8392db is described below

commit a2dfa8392db427e8e0ec6739e3bc44199f8a2cd7
Author: wudi <[email protected]>
AuthorDate: Mon Jul 7 09:55:18 2025 +0800

    [doc](ecosystem) update kafka connector doc (#2573)
    
    ## Versions
    
    - [x] dev
    - [x] 3.0
    - [x] 2.1
    - [ ] 2.0
    
    ## Languages
    
    - [x] Chinese
    - [x] English
    
    ## Docs Checklist
    
    - [ ] Checked by AI
    - [ ] Test Cases Built
---
 docs/ecosystem/doris-kafka-connector.md                    | 14 ++++++++++++--
 .../current/ecosystem/doris-kafka-connector.md             | 14 ++++++++++++--
 .../version-2.1/ecosystem/doris-kafka-connector.md         | 14 ++++++++++++--
 .../version-3.0/ecosystem/doris-kafka-connector.md         | 14 ++++++++++++--
 .../version-2.1/ecosystem/doris-kafka-connector.md         | 14 ++++++++++++--
 .../version-3.0/ecosystem/doris-kafka-connector.md         | 14 ++++++++++++--
 6 files changed, 72 insertions(+), 12 deletions(-)

diff --git a/docs/ecosystem/doris-kafka-connector.md b/docs/ecosystem/doris-kafka-connector.md
index c54ed06008f..b490566affe 100644
--- a/docs/ecosystem/doris-kafka-connector.md
+++ b/docs/ecosystem/doris-kafka-connector.md
@@ -9,7 +9,16 @@
 
 The Doris community provides the [doris-kafka-connector](https://github.com/apache/doris-kafka-connector) plug-in, which can write data from Kafka topics to Doris.
 
-## Usage Doris Kafka Connector
+## Version Description
+
+| Connector Version | Kafka Version | Doris Version | Java Version |
+| ----------------- | ------------- | ------------- | ------------ |
+| 1.0.0             | 2.4+          | 2.0+          | 8            |
+| 1.1.0             | 2.4+          | 2.0+          | 8            |
+| 24.0.0            | 2.4+          | 2.0+          | 8            |
+| 25.0.0            | 2.4+          | 2.0+          | 8            |
+
+## Usage
 
 ### Download
 [doris-kafka-connector](https://doris.apache.org/download)
@@ -19,7 +28,7 @@ maven dependencies
 <dependency>
   <groupId>org.apache.doris</groupId>
   <artifactId>doris-kafka-connector</artifactId>
-  <version>1.0.0</version>
+  <version>25.0.0</version>
 </dependency>
 ```
 
@@ -203,6 +212,7 @@ errors.deadletterqueue.topic.replication.factor=1
 | database.time_zone          | -                                    | UTC | N            | When `converter.mode` is not `normal` mode, it provides a way to specify time zone conversion for date data types (such as datetime, date, timestamp, etc.). The default is the UTC time zone. [...]
 | avro.topic2schema.filepath  | -                                    | -   | N            | By reading a locally provided Avro Schema file, the Avro content in the topic is parsed, decoupling the connector from the Schema Registry provided by Confluent. <br/> This configuration needs to be used with the `key.converter` or `value.converter` prefix. For example, the local Avro Schema file for con [...]
 | record.tablename.field      | -                                    | -   | N            | With this parameter configured, data from one Kafka topic can flow to multiple Doris tables. For configuration details, refer to: [#58](https://github.com/apache/doris-kafka-connector/pull/58) [...]
+| enable.combine.flush | `true`,<br/> `false` | false | N | Whether to merge the data from all partitions and write it together. The default value is false. When enabled, only at_least_once semantics are guaranteed. |
 
 For other Kafka Connect Sink common configuration items, please refer to: 
[connect_configuring](https://kafka.apache.org/documentation/#connect_configuring)
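Taken out of the diff, the new `enable.combine.flush` option drops into an ordinary sink-connector properties file next to the existing settings. A minimal sketch, assuming a hypothetical connector named `doris-sink` and placeholder connection values (the `doris.*` and `buffer.*` keys follow the connector's quick-start examples; verify them against the current docs):

```properties
name=doris-sink
connector.class=org.apache.doris.kafka.connector.DorisSinkConnector
topics=test_topic
doris.topic2table.map=test_topic:test_table
doris.urls=127.0.0.1
doris.http.port=8030
doris.query.port=9030
doris.user=root
doris.password=
doris.database=test_db
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter
# New in 25.0.0: merge data from all partitions into a single write.
# Enabling this downgrades delivery guarantees to at_least_once.
enable.combine.flush=false
```

Keeping the option at its default of `false` preserves the connector's stronger delivery path; setting it to `true` merges partition buffers for throughput at the cost of delivery semantics, as the new table row notes.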
 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/ecosystem/doris-kafka-connector.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/ecosystem/doris-kafka-connector.md
index ba3eeeafc0d..23d3a4f694a 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/ecosystem/doris-kafka-connector.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/ecosystem/doris-kafka-connector.md
@@ -9,7 +9,16 @@
 
 Doris 社区提供了 [doris-kafka-connector](https://github.com/apache/doris-kafka-connector) 插件,可以将 Kafka topic 中的数据写入到 Doris 中。
 
-## Doris Kafka Connector 使用
+## 版本说明
+
+| Connector Version | Kafka Version | Doris Version | Java Version |
+| ----------------- | ------------- | ------------- | ------------ |
+| 1.0.0             | 2.4+          | 2.0+          | 8            |
+| 1.1.0             | 2.4+          | 2.0+          | 8            |
+| 24.0.0            | 2.4+          | 2.0+          | 8            |
+| 25.0.0            | 2.4+          | 2.0+          | 8            |
+
+## 使用方式
 
 ### 下载
 [doris-kafka-connector](https://doris.apache.org/zh-CN/download)
@@ -19,7 +28,7 @@ maven 依赖
 <dependency>
   <groupId>org.apache.doris</groupId>
   <artifactId>doris-kafka-connector</artifactId>
-  <version>1.0.0</version>
+  <version>25.0.0</version>
 </dependency>
 ```
 
@@ -204,6 +213,7 @@ errors.deadletterqueue.topic.replication.factor=1
 | database.time_zone          | -                                    | UTC | N            | 当 `converter.mode` 为非 `normal` 模式时,对于日期数据类型(如 datetime, date, timestamp 等等)提供指定时区转换的方式,默认为 UTC 时区。 [...]
 | avro.topic2schema.filepath  | -                                    | -   | N            | 通过读取本地提供的 Avro Schema 文件,来解析 Topic 中的 Avro 文件内容,实现与 Confluent 提供 Schema 注册中心解耦。<br/> 此配置需要与 `key.converter` 或 `value.converter` 前缀一起使用,例如配置 avro-user、avro-product Topic 的本地 Avro Schema 文件如下: `"value.converter.avro.topic2schema.filepath":"avro-user:file:///opt/avro_user.avsc, avro-product:file:///opt/avro_product.avsc" [...]
 | record.tablename.field      | -                                    | -   | N            | 开启该参数后,可实现一个 Topic 的数据流向多个 Doris 表。配置详情参考:[#58](https://github.com/apache/doris-kafka-connector/pull/58) [...]
+| enable.combine.flush | `true`,<br/> `false` | false | N | 是否将所有分区的数据合并在一起写入。默认值为 false。开启后只能保证 at_least_once 语义。|
 
 其他 Kafka Connect Sink 
通用配置项可参考:[connect_configuring](https://kafka.apache.org/documentation/#connect_configuring)
 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/ecosystem/doris-kafka-connector.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/ecosystem/doris-kafka-connector.md
index 8db5714c3f7..c56299cbfd2 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/ecosystem/doris-kafka-connector.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/ecosystem/doris-kafka-connector.md
@@ -9,7 +9,16 @@
 
 Doris 社区提供了 [doris-kafka-connector](https://github.com/apache/doris-kafka-connector) 插件,可以将 Kafka topic 中的数据写入到 Doris 中。
 
-## Doris Kafka Connector 使用
+## 版本说明
+
+| Connector Version | Kafka Version | Doris Version | Java Version |
+| ----------------- | ------------- | ------------- | ------------ |
+| 1.0.0             | 2.4+          | 2.0+          | 8            |
+| 1.1.0             | 2.4+          | 2.0+          | 8            |
+| 24.0.0            | 2.4+          | 2.0+          | 8            |
+| 25.0.0            | 2.4+          | 2.0+          | 8            |
+
+## 使用方式
 
 ### 下载
 [doris-kafka-connector](https://doris.apache.org/zh-CN/download)
@@ -19,7 +28,7 @@ maven 依赖
 <dependency>
   <groupId>org.apache.doris</groupId>
   <artifactId>doris-kafka-connector</artifactId>
-  <version>1.0.0</version>
+  <version>25.0.0</version>
 </dependency>
 ```
 
@@ -204,6 +213,7 @@ errors.deadletterqueue.topic.replication.factor=1
 | database.time_zone          | -                                    | UTC | N            | 当 `converter.mode` 为非 `normal` 模式时,对于日期数据类型(如 datetime, date, timestamp 等等)提供指定时区转换的方式,默认为 UTC 时区。 [...]
 | avro.topic2schema.filepath  | -                                    | -   | N            | 通过读取本地提供的 Avro Schema 文件,来解析 Topic 中的 Avro 文件内容,实现与 Confluent 提供 Schema 注册中心解耦。<br/> 此配置需要与 `key.converter` 或 `value.converter` 前缀一起使用,例如配置 avro-user、avro-product Topic 的本地 Avro Schema 文件如下: `"value.converter.avro.topic2schema.filepath":"avro-user:file:///opt/avro_user.avsc, avro-product:file:///opt/avro_product.avsc" [...]
 | record.tablename.field      | -                                    | -   | N            | 开启该参数后,可实现一个 Topic 的数据流向多个 Doris 表。配置详情参考:[#58](https://github.com/apache/doris-kafka-connector/pull/58) [...]
+| enable.combine.flush | `true`,<br/> `false` | false | N | 是否将所有分区的数据合并在一起写入。默认值为 false。开启后只能保证 at_least_once 语义。|
 
 其他 Kafka Connect Sink 
通用配置项可参考:[connect_configuring](https://kafka.apache.org/documentation/#connect_configuring)
 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.0/ecosystem/doris-kafka-connector.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.0/ecosystem/doris-kafka-connector.md
index f0d513874cc..6a75dfb8c61 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.0/ecosystem/doris-kafka-connector.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.0/ecosystem/doris-kafka-connector.md
@@ -9,7 +9,16 @@
 
 Doris 社区提供了 [doris-kafka-connector](https://github.com/apache/doris-kafka-connector) 插件,可以将 Kafka topic 中的数据写入到 Doris 中。
 
-## Doris Kafka Connector 使用
+## 版本说明
+
+| Connector Version | Kafka Version | Doris Version | Java Version |
+| ----------------- | ------------- | ------------- | ------------ |
+| 1.0.0             | 2.4+          | 2.0+          | 8            |
+| 1.1.0             | 2.4+          | 2.0+          | 8            |
+| 24.0.0            | 2.4+          | 2.0+          | 8            |
+| 25.0.0            | 2.4+          | 2.0+          | 8            |
+
+## 使用方式
 
 ### 下载
 [doris-kafka-connector](https://doris.apache.org/zh-CN/download)
@@ -19,7 +28,7 @@ maven 依赖
 <dependency>
   <groupId>org.apache.doris</groupId>
   <artifactId>doris-kafka-connector</artifactId>
-  <version>1.0.0</version>
+  <version>25.0.0</version>
 </dependency>
 ```
 
@@ -204,6 +213,7 @@ errors.deadletterqueue.topic.replication.factor=1
 | database.time_zone          | -                                    | UTC | N            | 当 `converter.mode` 为非 `normal` 模式时,对于日期数据类型(如 datetime, date, timestamp 等等)提供指定时区转换的方式,默认为 UTC 时区。 [...]
 | avro.topic2schema.filepath  | -                                    | -   | N            | 通过读取本地提供的 Avro Schema 文件,来解析 Topic 中的 Avro 文件内容,实现与 Confluent 提供 Schema 注册中心解耦。<br/> 此配置需要与 `key.converter` 或 `value.converter` 前缀一起使用,例如配置 avro-user、avro-product Topic 的本地 Avro Schema 文件如下: `"value.converter.avro.topic2schema.filepath":"avro-user:file:///opt/avro_user.avsc, avro-product:file:///opt/avro_product.avsc" [...]
 | record.tablename.field      | -                                    | -   | N            | 开启该参数后,可实现一个 Topic 的数据流向多个 Doris 表。配置详情参考:[#58](https://github.com/apache/doris-kafka-connector/pull/58) [...]
+| enable.combine.flush | `true`,<br/> `false` | false | N | 是否将所有分区的数据合并在一起写入。默认值为 false。开启后只能保证 at_least_once 语义。|
 
 其他 Kafka Connect Sink 
通用配置项可参考:[connect_configuring](https://kafka.apache.org/documentation/#connect_configuring)
 
diff --git a/versioned_docs/version-2.1/ecosystem/doris-kafka-connector.md b/versioned_docs/version-2.1/ecosystem/doris-kafka-connector.md
index 113531c2986..94f044bdd75 100644
--- a/versioned_docs/version-2.1/ecosystem/doris-kafka-connector.md
+++ b/versioned_docs/version-2.1/ecosystem/doris-kafka-connector.md
@@ -9,7 +9,16 @@
 
 The Doris community provides the [doris-kafka-connector](https://github.com/apache/doris-kafka-connector) plug-in, which can write data from Kafka topics to Doris.
 
-## Usage Doris Kafka Connector
+## Version Description
+
+| Connector Version | Kafka Version | Doris Version | Java Version |
+| ----------------- | ------------- | ------------- | ------------ |
+| 1.0.0             | 2.4+          | 2.0+          | 8            |
+| 1.1.0             | 2.4+          | 2.0+          | 8            |
+| 24.0.0            | 2.4+          | 2.0+          | 8            |
+| 25.0.0            | 2.4+          | 2.0+          | 8            |
+
+## Usage
 
 ### Download
 [doris-kafka-connector](https://doris.apache.org/download)
@@ -19,7 +28,7 @@ maven dependencies
 <dependency>
   <groupId>org.apache.doris</groupId>
   <artifactId>doris-kafka-connector</artifactId>
-  <version>1.0.0</version>
+  <version>25.0.0</version>
 </dependency>
 ```
 
@@ -203,6 +212,7 @@ errors.deadletterqueue.topic.replication.factor=1
 | database.time_zone          | -                                    | UTC | N            | When `converter.mode` is not `normal` mode, it provides a way to specify time zone conversion for date data types (such as datetime, date, timestamp, etc.). The default is the UTC time zone. [...]
 | avro.topic2schema.filepath  | -                                    | -   | N            | By reading a locally provided Avro Schema file, the Avro content in the topic is parsed, decoupling the connector from the Schema Registry provided by Confluent. <br/> This configuration needs to be used with the `key.converter` or `value.converter` prefix. For example, the local Avro Schema file for con [...]
 | record.tablename.field      | -                                    | -   | N            | With this parameter configured, data from one Kafka topic can flow to multiple Doris tables. For configuration details, refer to: [#58](https://github.com/apache/doris-kafka-connector/pull/58) [...]
+| enable.combine.flush | `true`,<br/> `false` | false | N | Whether to merge the data from all partitions and write it together. The default value is false. When enabled, only at_least_once semantics are guaranteed. |
 
 For other Kafka Connect Sink common configuration items, please refer to: 
[connect_configuring](https://kafka.apache.org/documentation/#connect_configuring)
 
diff --git a/versioned_docs/version-3.0/ecosystem/doris-kafka-connector.md b/versioned_docs/version-3.0/ecosystem/doris-kafka-connector.md
index fc8ca9f2e07..0329079e852 100644
--- a/versioned_docs/version-3.0/ecosystem/doris-kafka-connector.md
+++ b/versioned_docs/version-3.0/ecosystem/doris-kafka-connector.md
@@ -9,7 +9,16 @@
 
 The Doris community provides the [doris-kafka-connector](https://github.com/apache/doris-kafka-connector) plug-in, which can write data from Kafka topics to Doris.
 
-## Usage Doris Kafka Connector
+## Version Description
+
+| Connector Version | Kafka Version | Doris Version | Java Version |
+| ----------------- | ------------- | ------------- | ------------ |
+| 1.0.0             | 2.4+          | 2.0+          | 8            |
+| 1.1.0             | 2.4+          | 2.0+          | 8            |
+| 24.0.0            | 2.4+          | 2.0+          | 8            |
+| 25.0.0            | 2.4+          | 2.0+          | 8            |
+
+## Usage
 
 ### Download
 [doris-kafka-connector](https://doris.apache.org/download)
@@ -19,7 +28,7 @@ maven dependencies
 <dependency>
   <groupId>org.apache.doris</groupId>
   <artifactId>doris-kafka-connector</artifactId>
-  <version>1.0.0</version>
+  <version>25.0.0</version>
 </dependency>
 ```
 
@@ -203,6 +212,7 @@ errors.deadletterqueue.topic.replication.factor=1
 | database.time_zone          | -                                    | UTC | N            | When `converter.mode` is not `normal` mode, it provides a way to specify time zone conversion for date data types (such as datetime, date, timestamp, etc.). The default is the UTC time zone. [...]
 | avro.topic2schema.filepath  | -                                    | -   | N            | By reading a locally provided Avro Schema file, the Avro content in the topic is parsed, decoupling the connector from the Schema Registry provided by Confluent. <br/> This configuration needs to be used with the `key.converter` or `value.converter` prefix. For example, the local Avro Schema file for con [...]
 | record.tablename.field      | -                                    | -   | N            | With this parameter configured, data from one Kafka topic can flow to multiple Doris tables. For configuration details, refer to: [#58](https://github.com/apache/doris-kafka-connector/pull/58) [...]
+| enable.combine.flush | `true`,<br/> `false` | false | N | Whether to merge the data from all partitions and write it together. The default value is false. When enabled, only at_least_once semantics are guaranteed. |
 
 For other Kafka Connect Sink common configuration items, please refer to: 
[connect_configuring](https://kafka.apache.org/documentation/#connect_configuring)
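In distributed mode the same keys travel as JSON through the Kafka Connect REST API rather than a properties file. A hedged sketch in Python that only assembles and prints the payload; the connector name `doris-sink`, the connection values, and the REST workflow in the trailing comment are illustrative assumptions, not part of the diff:

```python
import json

# Connector settings as a dict; the keys mirror the properties-file form
# shown in the connector docs. Values below are placeholders.
config = {
    "connector.class": "org.apache.doris.kafka.connector.DorisSinkConnector",
    "topics": "test_topic",
    "doris.urls": "127.0.0.1",
    "doris.http.port": "8030",
    "doris.query.port": "9030",
    "doris.user": "root",
    "doris.password": "",
    "doris.database": "test_db",
    # New in 25.0.0: combining partition flushes downgrades delivery
    # guarantees to at_least_once, so it is kept at the default here.
    "enable.combine.flush": "false",
}

payload = json.dumps(config, indent=2)
print(payload)

# In a live cluster this payload would typically be submitted with:
#   curl -X PUT -H "Content-Type: application/json" \
#        --data @config.json http://localhost:8083/connectors/doris-sink/config
```

Kafka Connect validates unknown keys lazily, so misspelling `enable.combine.flush` would silently leave the default in effect; double-check the key name against the table above.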
 


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
