libenchao commented on a change in pull request #10822: 
[FLINK-15081][docs-zh]Translate "Concepts & Common API" page of Table…
URL: https://github.com/apache/flink/pull/10822#discussion_r382670534
 
 

 ##########
 File path: docs/dev/table/common.zh.md
 ##########
 @@ -1091,17 +1070,17 @@ val dsTuple: DataSet[(String, Int)] = tableEnv.toDataSet[(String, Int)](table)
 
 {% top %}
 
-### Mapping of Data Types to Table Schema
+### 数据类型到 Table Schema 的映射
 
-Flink's DataStream and DataSet APIs support very diverse types. Composite types such as Tuples (built-in Scala and Flink Java tuples), POJOs, Scala case classes, and Flink's Row type allow for nested data structures with multiple fields that can be accessed in table expressions. Other types are treated as atomic types. In the following, we describe how the Table API converts these types into an internal row representation and show examples of converting a `DataStream` into a `Table`.
+Flink 的 DataStream 和 DataSet APIs 支持多样的数据类型。例如 Tuple(Scala 内置以及Flink Java tuple)、POJO 类型、Scala case class 类型以及 Flink 的 Row 类型等允许嵌套且有多个可在表的表达式中访问的字段的复合数据类型。其他类型被视为原子类型。 Composite types such as Tuples (built-in Scala and Flink Java tuples), POJOs, Scala case classes, and Flink's Row type allow for nested data structures with multiple fields that can be accessed in table expressions. 下面,我们讨论 Table API 如何将这些数据类型类型转换为内部 row 表示形式,并提供将流数据集转换成表的样例。
 
 Review comment:
   Forgot to remove the original English doc.

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
