PatrickRen commented on a change in pull request #13876:
URL: https://github.com/apache/flink/pull/13876#discussion_r545600084
##########
File path: docs/dev/table/connectors/kafka.zh.md
##########
@@ -75,185 +72,181 @@ CREATE TABLE kafkaTable (
</div>
</div>
-Connector Options
+Connector Options
----------------
<table class="table table-bordered">
<thead>
<tr>
- <th class="text-left" style="width: 25%">Option</th>
- <th class="text-center" style="width: 8%">Required</th>
- <th class="text-center" style="width: 7%">Default</th>
- <th class="text-center" style="width: 10%">Type</th>
- <th class="text-center" style="width: 50%">Description</th>
+ <th class="text-left" style="width: 25%">Option</th>
+ <th class="text-center" style="width: 8%">Required</th>
+ <th class="text-center" style="width: 7%">Default</th>
+ <th class="text-center" style="width: 10%">Type</th>
+ <th class="text-center" style="width: 50%">Description</th>
</tr>
</thead>
<tbody>
<tr>
<td><h5>connector</h5></td>
- <td>required</td>
- <td style="word-wrap: break-word;">(none)</td>
+ <td>required</td>
+ <td style="word-wrap: break-word;">(none)</td>
<td>String</td>
- <td>Specify what connector to use, for Kafka use: <code>'kafka'</code>.</td>
+ <td>Specify which connector to use. For Kafka, use: <code>'kafka'</code>.</td>
</tr>
<tr>
<td><h5>topic</h5></td>
- <td>required for sink, optional for source(use 'topic-pattern' instead if not set)</td>
- <td style="word-wrap: break-word;">(none)</td>
+ <td>required for sinks, optional for sources (use 'topic-pattern' instead if not set)</td>
+ <td style="word-wrap: break-word;">(none)</td>
<td>String</td>
- <td>Topic name(s) to read data from when the table is used as source. It also supports topic list for source by separating topic by semicolon like <code>'topic-1;topic-2'</code>. Note, only one of "topic-pattern" and "topic" can be specified for sources. When the table is used as sink, the topic name is the topic to write data to. Note topic list is not supported for sinks.</td>
+ <td>Topic name(s) to read data from when the table is used as a source. It also supports a semicolon-separated topic list for sources, such as <code>'topic-1;topic-2'</code>. Note that only one of "topic" and "topic-pattern" can be specified for sources. When the table is used as a sink, the topic name is the topic to write data to. Note that topic lists are not supported for sinks.</td>
</tr>
<tr>
<td><h5>topic-pattern</h5></td>
- <td>optional</td>
- <td style="word-wrap: break-word;">(none)</td>
+ <td>optional</td>
+ <td style="word-wrap: break-word;">(none)</td>
<td>String</td>
- <td>The regular expression for a pattern of topic names to read from. All topics with names that match the specified regular expression will be subscribed by the consumer when the job starts running. Note, only one of "topic-pattern" and "topic" can be specified for sources.</td>
+ <td>The regular expression used to match the topic names to read from. All topics whose names match the specified regular expression will be subscribed by the consumer when the job starts running. Note that only one of "topic" and "topic-pattern" can be specified for sources.</td>
</tr>
<tr>
<td><h5>properties.bootstrap.servers</h5></td>
- <td>required</td>
- <td style="word-wrap: break-word;">(none)</td>
+ <td>required</td>
+ <td style="word-wrap: break-word;">(none)</td>
<td>String</td>
- <td>Comma separated list of Kafka brokers.</td>
+ <td>Comma-separated list of Kafka brokers.</td>
</tr>
<tr>
<td><h5>properties.group.id</h5></td>
- <td>required by source</td>
- <td style="word-wrap: break-word;">(none)</td>
+ <td>required for sources</td>
+ <td style="word-wrap: break-word;">(none)</td>
<td>String</td>
- <td>The id of the consumer group for Kafka source, optional for Kafka sink.</td>
+ <td>The id of the consumer group for the Kafka source; optional for the Kafka sink.</td>
</tr>
<tr>
<td><h5>format</h5></td>
- <td>required</td>
- <td style="word-wrap: break-word;">(none)</td>
+ <td>required</td>
+ <td style="word-wrap: break-word;">(none)</td>
<td>String</td>
- <td>The format used to deserialize and serialize Kafka messages.
- The supported formats are <code>'csv'</code>, <code>'json'</code>, <code>'avro'</code>, <code>'debezium-json'</code> and <code>'canal-json'</code>.
- Please refer to <a href="{% link dev/table/connectors/formats/index.zh.md %}">Formats</a> page for more details and more format options.
+ <td>The format used to serialize and deserialize Kafka messages.
+ The supported formats are <code>'csv'</code>, <code>'json'</code>, <code>'avro'</code>, <code>'debezium-json'</code> and <code>'canal-json'</code>.
+ Please refer to the <a href="{% link dev/table/connectors/formats/index.zh.md %}">Formats</a> page for more details and format options.
</td>
</tr>
<tr>
<td><h5>scan.startup.mode</h5></td>
- <td>optional</td>
+ <td>optional</td>
<td style="word-wrap: break-word;">group-offsets</td>
<td>String</td>
- <td>Startup mode for Kafka consumer, valid values are <code>'earliest-offset'</code>, <code>'latest-offset'</code>, <code>'group-offsets'</code>, <code>'timestamp'</code> and <code>'specific-offsets'</code>.
- See the following <a href="#start-reading-position">Start Reading Position</a> for more details.</td>
+ <td>Startup mode for the Kafka consumer. Valid values are <code>'earliest-offset'</code>, <code>'latest-offset'</code>, <code>'group-offsets'</code>, <code>'timestamp'</code> and <code>'specific-offsets'</code>.
+ See <a href="#start-reading-position">Start Reading Position</a> below for more details.</td>
</tr>
<tr>
<td><h5>scan.startup.specific-offsets</h5></td>
- <td>optional</td>
- <td style="word-wrap: break-word;">(none)</td>
+ <td>optional</td>
+ <td style="word-wrap: break-word;">(none)</td>
<td>String</td>
- <td>Specify offsets for each partition in case of <code>'specific-offsets'</code> startup mode, e.g. <code>'partition:0,offset:42;partition:1,offset:300'</code>.
+ <td>Specify the offset for each partition in case of the <code>'specific-offsets'</code> startup mode, e.g. <code>'partition:0,offset:42;partition:1,offset:300'</code>.
</td>
</tr>
<tr>
<td><h5>scan.startup.timestamp-millis</h5></td>
- <td>optional</td>
- <td style="word-wrap: break-word;">(none)</td>
+ <td>optional</td>
+ <td style="word-wrap: break-word;">(none)</td>
<td>Long</td>
- <td>Start from the specified epoch timestamp (milliseconds) used in case of <code>'timestamp'</code> startup mode.</td>
+ <td>Start from the specified epoch timestamp (in milliseconds); used in case of the <code>'timestamp'</code> startup mode.</td>
</tr>
<tr>
<td><h5>scan.topic-partition-discovery.interval</h5></td>
- <td>optional</td>
- <td style="word-wrap: break-word;">(none)</td>
+ <td>optional</td>
+ <td style="word-wrap: break-word;">(none)</td>
<td>Duration</td>
- <td>Interval for consumer to discover dynamically created Kafka topics and partitions periodically.</td>
+ <td>Interval at which the consumer periodically discovers dynamically created Kafka topics and partitions.</td>
</tr>
<tr>
<td><h5>sink.partitioner</h5></td>
- <td>optional</td>
- <td style="word-wrap: break-word;">(none)</td>
+ <td>optional</td>
+ <td style="word-wrap: break-word;">(none)</td>
<td>String</td>
- <td>Output partitioning from Flink's partitions into Kafka's partitions. Valid values are
+ <td>Mapping from Flink partitions to Kafka partitions. Valid values are:
<ul>
- <li><code>fixed</code>: each Flink partition ends up in at most one Kafka partition.</li>
- <li><code>round-robin</code>: a Flink partition is distributed to Kafka partitions round-robin.</li>
- <li>Custom <code>FlinkKafkaPartitioner</code> subclass: e.g. <code>'org.mycompany.MyPartitioner'</code>.</li>
+ <li><code>fixed</code>: each Flink partition ends up in at most one Kafka partition.</li>
+ <li><code>round-robin</code>: a Flink partition is distributed to Kafka partitions in round-robin fashion.</li>
+ <li>A custom <code>FlinkKafkaPartitioner</code> subclass, e.g. <code>'org.mycompany.MyPartitioner'</code>.</li>
</ul>
</td>
</tr>
<tr>
<td><h5>sink.semantic</h5></td>
- <td>optional</td>
+ <td>optional</td>
<td style="word-wrap: break-word;">at-least-once</td>
<td>String</td>
- <td>Defines the delivery semantic for the Kafka sink. Valid enumerationns are <code>'at-lease-once'</code>, <code>'exactly-once'</code> and <code>'none'</code>.
- See <a href='#consistency-guarantees'>Consistency guarantees</a> for more details. </td>
+ <td>Defines the delivery semantics for the Kafka sink. Valid values are <code>'at-least-once'</code>, <code>'exactly-once'</code> and <code>'none'</code>.
+ See <a href='#consistency-guarantees'>Consistency Guarantees</a> for more details.</td>
</tr>
</tbody>
</table>
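For orientation, here is a minimal sketch of how the options documented above fit together in DDL. This is illustrative only and not part of the diff; the table names, schema, topic names, and broker address are placeholder assumptions.

```sql
-- Source table sketch: option keys are the ones documented in the table above;
-- all values are illustrative placeholders.
CREATE TABLE kafka_source (
  user_id BIGINT,
  behavior STRING
) WITH (
  'connector' = 'kafka',                              -- required
  'topic' = 'user_behavior',                          -- or a list like 'topic-1;topic-2'
  'properties.bootstrap.servers' = 'localhost:9092',  -- comma-separated broker list
  'properties.group.id' = 'testGroup',                -- required for sources
  'format' = 'json',                                  -- csv/json/avro/debezium-json/canal-json
  'scan.startup.mode' = 'earliest-offset'             -- default is group-offsets
);

-- Sink table sketch using the sink-side options.
CREATE TABLE kafka_sink (
  user_id BIGINT,
  behavior STRING
) WITH (
  'connector' = 'kafka',
  'topic' = 'user_behavior_out',                      -- sinks take exactly one topic
  'properties.bootstrap.servers' = 'localhost:9092',
  'format' = 'json',
  'sink.partitioner' = 'round-robin',                 -- or 'fixed', or a custom class
  'sink.semantic' = 'at-least-once'                   -- the default
);
```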
-Features
+Features
----------------
-### Topic and Partition Discovery
+### Topic and Partition Discovery
-The config option `topic` and `topic-pattern` specifies the topics or topic pattern to consume for source. The config option `topic` can accept topic list using semicolon separator like 'topic-1;topic-2'.
-The config option `topic-pattern` will use regular expression to discover the matched topic. For example, if the `topic-pattern` is `test-topic-[0-9]`, then all topics with names that match the specified regular expression (starting with `test-topic-` and ending with a single digit)) will be subscribed by the consumer when the job starts running.
+The config options `topic` and `topic-pattern` specify the topics or the topic pattern for the source to consume. The `topic` option accepts a semicolon-separated topic list such as 'topic-1;topic-2'.
+The `topic-pattern` option uses a regular expression to discover the matching topics. For example, if `topic-pattern` is `test-topic-[0-9]`, all topics whose names match the specified regular expression (starting with `test-topic-` and ending with a single digit) will be subscribed by the consumer when the job starts running.
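As a concrete sketch of the two mutually exclusive subscription styles, here are the corresponding WITH-clause fragments (option values taken from the text above; not part of this diff):

```sql
-- Fixed, semicolon-separated topic list (sources only; sinks take a single topic):
'topic' = 'topic-1;topic-2'

-- Pattern-based subscription; must not be combined with 'topic':
'topic-pattern' = 'test-topic-[0-9]'
```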
-To allow the consumer to discover dynamically created topics after the job started running, set a non-negative value for `scan.topic-partition-discovery.interval`. This allows the consumer to discover partitions of new topics with names that also match the specified pattern.
+To allow the consumer to discover dynamically created topics after the job has started running, set a non-negative value for `scan.topic-partition-discovery.interval`. This also enables the consumer to discover partitions of new topics whose names match the specified pattern.
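A sketch of turning discovery on (WITH-clause fragment, not part of this diff; the 10-second interval is an illustrative assumption, not a value from this page):

```sql
-- Periodically scan for newly created topics/partitions matching the subscription:
'scan.topic-partition-discovery.interval' = '10s'
```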
Review comment:
Fixed in the latest commit~
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]