wuchong commented on a change in pull request #13801:
URL: https://github.com/apache/flink/pull/13801#discussion_r513141770



##########
File path: docs/dev/table/connectors/formats/avro-confluent.zh.md
##########
@@ -72,50 +72,50 @@ CREATE TABLE user_behavior (
 </div>
 </div>
 
-Format Options
+Format 参数
 ----------------
 
 <table class="table table-bordered">
     <thead>
       <tr>
-        <th class="text-left" style="width: 25%">Option</th>
-        <th class="text-center" style="width: 8%">Required</th>
-        <th class="text-center" style="width: 7%">Default</th>
-        <th class="text-center" style="width: 10%">Type</th>
-        <th class="text-center" style="width: 50%">Description</th>
+        <th class="text-left" style="width: 25%">参数</th>
+        <th class="text-center" style="width: 8%">是否必选</th>
+        <th class="text-center" style="width: 7%">默认值</th>
+        <th class="text-center" style="width: 10%">类型</th>
+        <th class="text-center" style="width: 50%">描述</th>
       </tr>
     </thead>
     <tbody>
     <tr>
       <td><h5>format</h5></td>
-      <td>required</td>
+      <td>必选</td>
       <td style="word-wrap: break-word;">(none)</td>
       <td>String</td>
-      <td>Specify what format to use, here should be <code>'avro-confluent'</code>.</td>
+      <td>指定要使用的格式,这里应该是 <code>'avro-confluent'</code>.</td>
     </tr>
     <tr>
       <td><h5>avro-confluent.schema-registry.url</h5></td>
-      <td>required</td>
+      <td>必选</td>
       <td style="word-wrap: break-word;">(none)</td>
       <td>String</td>
-      <td>The URL of the Confluent Schema Registry to fetch/register schemas</td>
+      <td>用于获取/注册 schemas 的 Confluent Schema Registry 的URL </td>
     </tr>
     <tr>
       <td><h5>avro-confluent.schema-registry.subject</h5></td>
-      <td>required by sink</td>
+      <td>sink 必选</td>
       <td style="word-wrap: break-word;">(none)</td>
       <td>String</td>
-      <td>The Confluent Schema Registry subject under which to register the schema used by this format during serialization</td>
+      <td>Confluent Schema Registry主题,用于在序列化期间注册此格式使用的 schema </td>
     </tr>
     </tbody>
 </table>
 
-Data Type Mapping
+数据类型映射
 ----------------
 
-Currently, Apache Flink always uses the table schema to derive the Avro reader schema during deserialization and Avro writer schema during serialization. Explicitly defining an Avro schema is not supported yet.
-See the [Apache Avro Format]({% link dev/table/connectors/formats/avro.zh.md%}#data-type-mapping) for the mapping between Avro and Flink DataTypes.
+目前 Apache Flink 都是从 table schema 去推断反序列化期间的 Avro reader schema 和序列化期间的 Avro writer schema。显式地定义 Avro schema 暂不支持。
+[Apache Avro Format]({% link dev/table/connectors/formats/avro.zh.md%}#data-type-mapping)中描述了flink数据和Avro数据的对应关系。
 

Review comment:
       ```suggestion
  [Apache Avro Format]({% link dev/table/connectors/formats/avro.zh.md%}#data-type-mapping)中描述了 Flink 数据类型和 Avro 类型的对应关系。
   ```
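
For context around the option table quoted above, here is a minimal source-table sketch of how these options appear in a DDL; the Kafka connector settings and the registry URL below are illustrative placeholders rather than values taken from this page:

```sql
-- Sketch of a source table using the options documented above.
-- The connector/topic/bootstrap settings are assumed placeholders.
CREATE TABLE user_behavior_source (
  user_id  BIGINT,
  item_id  BIGINT,
  behavior STRING
) WITH (
  'connector' = 'kafka',
  'topic' = 'user_behavior',
  'properties.bootstrap.servers' = 'localhost:9092',
  -- options from the table above:
  'format' = 'avro-confluent',
  'avro-confluent.schema-registry.url' = 'http://localhost:8081'
);
```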

##########
File path: docs/dev/table/connectors/formats/avro-confluent.zh.md
##########
@@ -72,50 +72,50 @@ CREATE TABLE user_behavior (
 </div>
 </div>
 
-Format Options
+Format 参数
 ----------------
 
 <table class="table table-bordered">
     <thead>
       <tr>
-        <th class="text-left" style="width: 25%">Option</th>
-        <th class="text-center" style="width: 8%">Required</th>
-        <th class="text-center" style="width: 7%">Default</th>
-        <th class="text-center" style="width: 10%">Type</th>
-        <th class="text-center" style="width: 50%">Description</th>
+        <th class="text-left" style="width: 25%">参数</th>
+        <th class="text-center" style="width: 8%">是否必选</th>
+        <th class="text-center" style="width: 7%">默认值</th>
+        <th class="text-center" style="width: 10%">类型</th>
+        <th class="text-center" style="width: 50%">描述</th>
       </tr>
     </thead>
     <tbody>
     <tr>
       <td><h5>format</h5></td>
-      <td>required</td>
+      <td>必选</td>
       <td style="word-wrap: break-word;">(none)</td>
       <td>String</td>
-      <td>Specify what format to use, here should be <code>'avro-confluent'</code>.</td>
+      <td>指定要使用的格式,这里应该是 <code>'avro-confluent'</code>.</td>
     </tr>
     <tr>
       <td><h5>avro-confluent.schema-registry.url</h5></td>
-      <td>required</td>
+      <td>必选</td>
       <td style="word-wrap: break-word;">(none)</td>
       <td>String</td>
-      <td>The URL of the Confluent Schema Registry to fetch/register schemas</td>
+      <td>用于获取/注册 schemas 的 Confluent Schema Registry 的URL </td>
     </tr>
     <tr>
       <td><h5>avro-confluent.schema-registry.subject</h5></td>
-      <td>required by sink</td>
+      <td>sink 必选</td>
       <td style="word-wrap: break-word;">(none)</td>
       <td>String</td>
-      <td>The Confluent Schema Registry subject under which to register the schema used by this format during serialization</td>
+      <td>Confluent Schema Registry主题,用于在序列化期间注册此格式使用的 schema </td>
     </tr>
     </tbody>
 </table>
 
-Data Type Mapping
+数据类型映射
 ----------------
 
-Currently, Apache Flink always uses the table schema to derive the Avro reader schema during deserialization and Avro writer schema during serialization. Explicitly defining an Avro schema is not supported yet.
-See the [Apache Avro Format]({% link dev/table/connectors/formats/avro.zh.md%}#data-type-mapping) for the mapping between Avro and Flink DataTypes.
+目前 Apache Flink 都是从 table schema 去推断反序列化期间的 Avro reader schema 和序列化期间的 Avro writer schema。显式地定义 Avro schema 暂不支持。
+[Apache Avro Format]({% link dev/table/connectors/formats/avro.zh.md%}#data-type-mapping)中描述了flink数据和Avro数据的对应关系。
 
 
-In addition to the types listed there, Flink supports reading/writing nullable types. Flink maps nullable types to Avro `union(something, null)`, where `something` is the Avro type converted from Flink type.
+除了此处列出的类型之外,Flink 还支持读取/写入可为空的类型。 Flink 将可为空的类型映射到 Avro `union(something, null)`, 其中 `something` 是从 Flink 类型转换的 Avro 类型。

Review comment:
       ```suggestion
  除了此处列出的类型之外,Flink 还支持读取/写入可为空(nullable)的类型。 Flink 将可为空的类型映射到 Avro `union(something, null)`, 其中 `something` 是从 Flink 类型转换的 Avro 类型。
   ```
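
To illustrate the nullable-type sentence reviewed above, a hypothetical table follows; the Avro types in the comments are the shape the quoted text describes, not output captured from Flink, and the connector settings are assumed placeholders:

```sql
-- Hypothetical table; per the text above, a NOT NULL column should map to a
-- plain Avro type, while a nullable column should map to union(<type>, null).
CREATE TABLE nullable_example (
  id   BIGINT NOT NULL,  -- expected Avro type: long
  name STRING            -- nullable: expected Avro type: union(string, null)
) WITH (
  'connector' = 'kafka',  -- assumed placeholder connector config
  'topic' = 'nullable_example',
  'properties.bootstrap.servers' = 'localhost:9092',
  'format' = 'avro-confluent',
  'avro-confluent.schema-registry.url' = 'http://localhost:8081'
);
```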

##########
File path: docs/dev/table/connectors/formats/avro-confluent.zh.md
##########
@@ -29,27 +29,27 @@ under the License.
 * This will be replaced by the TOC
 {:toc}
 
-The Avro Schema Registry (``avro-confluent``) format allows you to read records that were serialized by the ``io.confluent.kafka.serializers.KafkaAvroSerializer`` and to write records that can in turn be read by the ``io.confluent.kafka.serializers.KafkaAvroDeserializer``. 
+Avro Schema Registry (``avro-confluent``) 格式能让你读取被 ``io.confluent.kafka.serializers.KafkaAvroSerializer``序列化的记录并可以写入成 ``io.confluent.kafka.serializers.KafkaAvroDeserializer``反序列化的记录。

Review comment:
       ```suggestion
  Avro Schema Registry (``avro-confluent``) 格式能让你读取被 ``io.confluent.kafka.serializers.KafkaAvroSerializer``序列化的记录,以及可以写入成能被 ``io.confluent.kafka.serializers.KafkaAvroDeserializer``反序列化的记录。
   ```

##########
File path: docs/dev/table/connectors/formats/avro-confluent.zh.md
##########
@@ -29,27 +29,27 @@ under the License.
 * This will be replaced by the TOC
 {:toc}
 
-The Avro Schema Registry (``avro-confluent``) format allows you to read records that were serialized by the ``io.confluent.kafka.serializers.KafkaAvroSerializer`` and to write records that can in turn be read by the ``io.confluent.kafka.serializers.KafkaAvroDeserializer``. 
+Avro Schema Registry (``avro-confluent``) 格式能让你读取被 ``io.confluent.kafka.serializers.KafkaAvroSerializer``序列化的记录并可以写入成 ``io.confluent.kafka.serializers.KafkaAvroDeserializer``反序列化的记录。
 
-When reading (deserializing) a record with this format the Avro writer schema is fetched from the configured Confluent Schema Registry based on the schema version id encoded in the record while the reader schema is inferred from table schema. 
+当以这种格式读取(反序列化)记录时,将根据记录中编码的 schema 版本 id 从配置的 Confluent Schema Registry 中获取 Avro writer schema ,而从 table schema 中推断出 reader schema。
 
-When writing (serializing) a record with this format the Avro schema is inferred from the table schema and used to retrieve a schema id to be encoded with the data. The lookup is performed with in the configured Confluent Schema Registry under the [subject](https://docs.confluent.io/current/schema-registry/index.html#schemas-subjects-and-topics) given in `avro-confluent.schema-registry.subject`.
+当以这种格式写入(序列化)记录时,Avro schema 是从 table schema 中推断出来的,并用于检索要与数据一起编码的 schema id。检索是在 Confluent Schema Registry 配置中的 `avro-confluent.schema-registry.subject` 中指定的[subject](https://docs.confluent.io/current/schema-registry/index.html#schemas-subjects-and-topics)下执行的。

Review comment:
       ```suggestion
  当以这种格式写入(序列化)记录时,Avro schema 是从 table schema 中推断出来的,并会用来检索要与数据一起编码的 schema id。我们会在配置的 Confluent Schema Registry 中,配置的 [subject](https://docs.confluent.io/current/schema-registry/index.html#schemas-subjects-and-topics) 下,检索 schema id。subject 通过 `avro-confluent.schema-registry.subject` 参数来制定。
   ```
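
For the write path described above, here is a hypothetical sink sketch; only `format`, the registry URL, and the subject option come from this page, the rest of the WITH clause is placeholder config, and the subject value is an arbitrary example name:

```sql
-- Sketch of a sink table; the subject option below is the one the write-path
-- paragraph refers to (required by sinks, used to register/look up the schema id).
CREATE TABLE user_behavior_sink (
  user_id  BIGINT,
  behavior STRING
) WITH (
  'connector' = 'kafka',
  'topic' = 'user_behavior_out',
  'properties.bootstrap.servers' = 'localhost:9092',
  'format' = 'avro-confluent',
  'avro-confluent.schema-registry.url' = 'http://localhost:8081',
  'avro-confluent.schema-registry.subject' = 'user_behavior_out-value'
);
```

An `INSERT INTO user_behavior_sink ...` would then trigger the schema-id lookup under that subject, as the paragraph above describes.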



