lirui-apache commented on a change in pull request #11391: [FLINK-16098]
[chinese-translation, Documentation] Translate "Overview" page of "Hive
Integration" into Chinese
URL: https://github.com/apache/flink/pull/11391#discussion_r394767029
##########
File path: docs/dev/table/hive/index.zh.md
##########
@@ -77,29 +77,29 @@ Flink supports the following Hive versions.
- 3.1.1
- 3.1.2
-Please note Hive itself have different features available for different versions, and these issues are not caused by Flink:
+Please note that Hive itself offers different features in different versions, and these compatibility issues are not caused by Flink:
-- Hive built-in functions are supported in 1.2.0 and later.
-- Column constraints, i.e. PRIMARY KEY and NOT NULL, are supported in 3.1.0 and later.
-- Altering table statistics is supported in 1.2.0 and later.
-- `DATE` column statistics are supported in 1.2.0 and later.
-- Writing to ORC tables is not supported in 2.0.x.
+- Hive built-in functions are supported in 1.2.0 and later.
+- Column constraints, i.e. PRIMARY KEY and NOT NULL, are supported in 3.1.0 and later.
+- Altering table statistics is supported in 1.2.0 and later.
+- `DATE` column statistics are supported in 1.2.0 and later.
+- Writing to ORC tables is not supported in 2.0.x.
-### Dependencies
+### Dependencies
-To integrate with Hive, you need to add some extra dependencies to the `/lib/` directory in Flink distribution
-to make the integration work in Table API program or SQL in SQL Client.
-Alternatively, you can put these dependencies in a dedicated folder, and add them to classpath with the `-C`
-or `-l` option for Table API program or SQL Client respectively.
+To integrate with Hive, you need to add some extra dependencies to the `/lib/` directory of your Flink distribution,
+so that the integration works in Table API programs or in SQL run through the SQL Client.
Review comment:
```suggestion
so that you can interact with Hive through the Table API or the SQL Client.
```
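The two deployment options discussed in the diff can be sketched as shell commands. This is a minimal illustration only: the Hive connector jar name, versions, and paths are assumptions for the example, not taken from the document, while `-C` (`--classpath`) for `flink run` and `-l` (`--library`) for the SQL Client are the options the original text names.

```shell
# Option 1: copy the extra Hive dependencies into Flink's lib/ directory
# (jar name and $FLINK_HOME are illustrative assumptions).
cp flink-connector-hive_2.11-1.10.0.jar "$FLINK_HOME/lib/"

# Option 2: keep the dependencies in a dedicated folder and add them
# to the classpath explicitly:
#   -C (--classpath) when submitting a Table API program via `flink run`,
#   -l (--library)   when starting the SQL Client.
flink run -C "file:///opt/hive-deps/flink-connector-hive_2.11-1.10.0.jar" MyTableApiJob.jar
sql-client.sh embedded -l /opt/hive-deps/
```

Option 1 makes the dependencies visible to every job on the cluster, while option 2 scopes them to a single program or SQL Client session.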