lirui-apache commented on a change in pull request #11391: [FLINK-16098] [chinese-translation, Documentation] Translate "Overview" page of "Hive Integration" into Chinese
URL: https://github.com/apache/flink/pull/11391#discussion_r394768081
##########
File path: docs/dev/table/hive/index.zh.md
##########
@@ -77,29 +77,29 @@ Flink supports the following Hive versions.
- 3.1.1
- 3.1.2
-Please note Hive itself have different features available for different versions, and these issues are not caused by Flink:
+Please note that Hive itself has different features available in different versions, and these compatibility issues are not caused by Flink:
-- Hive built-in functions are supported in 1.2.0 and later.
-- Column constraints, i.e. PRIMARY KEY and NOT NULL, are supported in 3.1.0 and later.
-- Altering table statistics is supported in 1.2.0 and later.
-- `DATE` column statistics are supported in 1.2.0 and later.
-- Writing to ORC tables is not supported in 2.0.x.
+- Hive built-in functions are supported in 1.2.0 and later.
+- Column constraints, i.e. PRIMARY KEY and NOT NULL, are supported in 3.1.0 and later.
+- Altering table statistics is supported in 1.2.0 and later.
+- `DATE` column statistics are supported in 1.2.0 and later.
+- Writing to ORC tables is not supported in 2.0.x.
-### Dependencies
+### Dependencies
-To integrate with Hive, you need to add some extra dependencies to the `/lib/` directory in Flink distribution
-to make the integration work in Table API program or SQL in SQL Client.
-Alternatively, you can put these dependencies in a dedicated folder, and add them to classpath with the `-C`
-or `-l` option for Table API program or SQL Client respectively.
+To integrate with Hive, you need to add some extra dependencies to the `/lib/` directory of the Flink distribution
+so that the integration works in Table API programs or in SQL via the SQL Client.
+Alternatively, you can put these dependencies in a dedicated folder, and add them to the classpath with the `-C` or `-l` option for Table API programs or the SQL Client respectively.
-Apache Hive is built on Hadoop, so you need Hadoop dependency first, please refer to
+Apache Hive is built on top of Hadoop, so you first need the Hadoop dependency; please refer to
[Providing Hadoop classes]({{ site.baseurl }}/ops/deployment/hadoop.html#providing-hadoop-classes).
-There are two ways to add Hive dependencies. First is to use Flink's bundled Hive jars. You can choose a bundled Hive jar according to the version of the metastore you use. Second is to add each of the required jars separately. The second way can be useful if the Hive version you're using is not listed here.
+There are two ways to add Hive dependencies. The first is to use Flink's bundled Hive jars. You can choose a bundled Hive jar according to the version of the metastore you use. The second is to add each of the required jars separately. The second way is better suited if the Hive version you're using is not listed here.
-#### Using bundled hive jar
+#### Using bundled Hive jar
+
+The table below lists all the bundled hive jars. You can choose one in the `/lib/` directory of the Flink distribution.
Review comment:
```suggestion
The table below lists all the available Hive jars. You can pick one and place it in the `/lib/` directory of the Flink distribution.
```
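As a concrete illustration of the Table API path the hunk above describes, here is a minimal sketch of registering a `HiveCatalog` once the Hive and Hadoop dependencies are on the classpath. The catalog name `myhive`, the database `default`, and the `/opt/hive-conf` directory are placeholder assumptions for this sketch, not values taken from the PR:

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.catalog.hive.HiveCatalog;

public class HiveCatalogExample {
    public static void main(String[] args) {
        // Hive integration uses the Blink planner (batch mode shown here).
        EnvironmentSettings settings = EnvironmentSettings.newInstance()
                .useBlinkPlanner()
                .inBatchMode()
                .build();
        TableEnvironment tableEnv = TableEnvironment.create(settings);

        // Placeholder values: point hiveConfDir at the directory
        // that contains your hive-site.xml.
        String name = "myhive";
        String defaultDatabase = "default";
        String hiveConfDir = "/opt/hive-conf";

        HiveCatalog hive = new HiveCatalog(name, defaultDatabase, hiveConfDir);
        tableEnv.registerCatalog(name, hive);

        // Resolve unqualified table names against the Hive catalog.
        tableEnv.useCatalog(name);
    }
}
```

If the dependencies live in a dedicated folder rather than `/lib/`, they can be added to the classpath at submission time, e.g. via the `-C` option of `flink run` for a Table API program, as the translated paragraph notes.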