caozhen1937 commented on a change in pull request #14126:
URL: https://github.com/apache/flink/pull/14126#discussion_r526873487
##########
File path: docs/dev/table/connectors/upsert-kafka.zh.md
##########
@@ -29,36 +29,24 @@ under the License.
* This will be replaced by the TOC
{:toc}
-The Upsert Kafka connector allows for reading data from and writing data into Kafka topics in the upsert fashion.
+The Upsert Kafka connector supports reading data from and writing data into Kafka topics in the upsert fashion.
-As a source, the upsert-kafka connector produces a changelog stream, where each data record represents
-an update or delete event. More precisely, the value in a data record is interpreted as an UPDATE of
-the last value for the same key, if any (if a corresponding key doesn’t exist yet, the update will
-be considered an INSERT). Using the table analogy, a data record in a changelog stream is interpreted
-as an UPSERT aka INSERT/UPDATE because any existing row with the same key is overwritten. Also, null
-values are interpreted in a special way: a record with a null value represents a “DELETE”.
+As a source, the upsert-kafka connector produces a changelog stream, where each data record represents an update or delete event. More precisely, the value in a data record is interpreted as an UPDATE of the last value for the same key, if any (if the corresponding key does not exist yet, the update is treated as an INSERT). Using the table analogy, a data record in a changelog stream is interpreted as an UPSERT, aka INSERT/UPDATE, because any existing row with the same key is overwritten. Also, null values are interpreted in a special way: a record with a null value represents a “DELETE”.
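For reference, the upsert semantics described in the paragraph above can be sketched with a small simulation (a hypothetical illustration in plain Python, not the Flink API; `apply_changelog` is an invented helper name):

```python
# Simulate the upsert-kafka changelog semantics: each record is (key, value).
# A non-null value is an UPSERT (INSERT if the key is new, else UPDATE);
# a null (None) value is a DELETE for that key.
def apply_changelog(records):
    state = {}
    for key, value in records:
        if value is None:
            state.pop(key, None)   # null value => DELETE the row, if present
        else:
            state[key] = value     # same key overwrites => INSERT/UPDATE
    return state

# "a" is inserted then updated; "b" is inserted then deleted.
result = apply_changelog([("a", 1), ("b", 2), ("a", 3), ("b", None)])
print(result)  # {'a': 3}
```

This mirrors the table analogy: the final state keeps only the last non-null value per key.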
Review comment:
I agree with you. Keeping `changelog` as-is.
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]