RocMarshal commented on a change in pull request #13271:
URL: https://github.com/apache/flink/pull/13271#discussion_r479002369
##########
File path: docs/monitoring/logging.zh.md
##########
@@ -23,47 +23,51 @@ specific language governing permissions and limitations
under the License.
-->
-The logging in Flink is implemented using the slf4j logging interface. As underlying logging framework, log4j2 is used. We also provide logback configuration files and pass them to the JVM's as properties. Users willing to use logback instead of log4j2 can just exclude log4j2 (or delete it from the lib/ folder).
+Flink 中的日志记录是使用 slf4j 日志接口实现的。使用 log4j2 作为底层日志框架。我们也支持了 logback 日志配置,只要将其配置文件作为属性传递给 JVM 即可。愿意使用 logback 而不是 log4j2 的用户只需排除 log4j2 的依赖(或从 lib/ 文件夹中删除它)即可。
Review comment:
```suggestion
Flink 中的日志记录是使用 slf4j 日志接口实现的。使用 log4j2 作为底层日志框架。我们也支持了 logback 日志配置,只要将其配置文件作为参数传递给 JVM 即可。愿意使用 logback 而不是 log4j2 的用户只需排除 log4j2 的依赖(或从 lib/ 文件夹中删除它)即可。
```
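
For readers of this thread, a minimal hypothetical sketch (not part of the patch) of the slf4j facade the quoted paragraph refers to: user code logs only against the slf4j API, and whether log4j2 or logback does the actual work is decided purely by which backend jars are on the classpath / in lib/.

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class LoggingExample {

    // Resolved at runtime against whichever slf4j backend is on the classpath
    // (log4j2 by default in Flink, logback if log4j2 is excluded from lib/).
    private static final Logger LOG = LoggerFactory.getLogger(LoggingExample.class);

    public static void main(String[] args) {
        // The statement itself is backend-agnostic.
        LOG.info("Parallelism set to {}", 4);
    }
}
```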
##########
File path: docs/monitoring/logging.zh.md
##########
@@ -23,47 +23,51 @@ specific language governing permissions and limitations
under the License.
-->
-The logging in Flink is implemented using the slf4j logging interface. As underlying logging framework, log4j2 is used. We also provide logback configuration files and pass them to the JVM's as properties. Users willing to use logback instead of log4j2 can just exclude log4j2 (or delete it from the lib/ folder).
+Flink 中的日志记录是使用 slf4j 日志接口实现的。使用 log4j2 作为底层日志框架。我们也支持了 logback 日志配置,只要将其配置文件作为属性传递给 JVM 即可。愿意使用 logback 而不是 log4j2 的用户只需排除 log4j2 的依赖(或从 lib/ 文件夹中删除它)即可。
* This will be replaced by the TOC
{:toc}
-## Configuring Log4j2
+<a name="configuring-log4j2"></a>
-Log4j2 is controlled using property files. In Flink's case, the file is usually called `log4j.properties`. We pass the filename and location of this file using the `-Dlog4j.configurationFile=` parameter to the JVM.
+## 配置 Log4j2
-Flink ships with the following default properties files:
+Log4j2 是使用属性文件指定的。在 Flink 的使用中,该文件通常命名为 `log4j.properties`。我们使用 `-Dlog4j.configurationFile=` 参数将该文件的文件名和位置传递给 JVM。
-- `log4j-cli.properties`: Used by the Flink command line client (e.g. `flink run`) (not code executed on the cluster)
-- `log4j-session.properties`: Used by the Flink command line client when starting a YARN or Kubernetes session (`yarn-session.sh`, `kubernetes-session.sh`)
-- `log4j.properties`: JobManager/Taskmanager logs (both standalone and YARN)
+Flink 附带以下默认属性文件:
-### Compatibility with Log4j1
+- `log4j-cli.properties`:由 Flink 命令行客户端使用(例如 `flink run`)(不包括在集群上执行的代码)
+- `log4j-session.properties`:Flink 命令行客户端在启动 YARN 或 Kubernetes session 时使用(`yarn-session.sh`,`kubernetes-session.sh`)
+- `log4j.properties`:作为 JobManager/TaskManager 日志配置使用(standalone 和 YARN 两种模式下皆使用)
-Flink ships with the [Log4j API bridge](https://logging.apache.org/log4j/log4j-2.2/log4j-1.2-api/index.html), allowing existing applications that work against Log4j1 classes to continue working.
+<a name="compatibility-with-log4j1"></a>
-If you have custom Log4j1 properties files or code that relies on Log4j1, please check out the official Log4j [compatibility](https://logging.apache.org/log4j/2.x/manual/compatibility.html) and [migration](https://logging.apache.org/log4j/2.x/manual/migration.html) guides.
+### 与 Log4j1 的兼容性
-## Configuring Log4j1
+Flink 附带了 [Log4j API bridge](https://logging.apache.org/log4j/log4j-2.2/log4j-1.2-api/index.html),使得对 Log4j1 工作的现有应用程序继续工作。
-To use Flink with Log4j1 you must ensure that:
-- `org.apache.logging.log4j:log4j-core`, `org.apache.logging.log4j:log4j-slf4j-impl` and `org.apache.logging.log4j:log4j-1.2-api` are not on the classpath,
-- `log4j:log4j`, `org.slf4j:slf4j-log4j12`, `org.apache.logging.log4j:log4j-to-slf4j` and `org.apache.logging.log4j:log4j-api` are on the classpath.
+如果你有基于 Log4j1 的自定义配置文件或代码,请查看官方 Log4j [兼容性](https://logging.apache.org/log4j/2.x/manual/compatibility.html)和[迁移](https://logging.apache.org/log4j/2.x/manual/migration.html)指南。
-In the IDE this means you have to replace such dependencies defined in your pom, and possibly add exclusions on dependencies that transitively depend on them.
+<a name="configuring-log4j1"></a>
-For Flink distributions this means you have to
-- remove the `log4j-core`, `log4j-slf4j-impl` and `log4j-1.2-api` jars from the `lib` directory,
-- add the `log4j`, `slf4j-log4j12` and `log4j-to-slf4j` jars to the `lib` directory,
-- replace all log4j properties files in the `conf` directory with Log4j1-compliant versions.
+## 配置 Log4j1
-## Configuring logback
+要将 Flink 与 Log4j1 一起使用,必须确保:
+- Classpath 中不存在 `org.apache.logging.log4j:log4j-core`,`org.apache.logging.log4j:log4j-slf4j-impl` 和 `org.apache.logging.log4j:log4j-1.2-api`,
+- 且 Classpath 中存在 `log4j:log4j`,`org.slf4j:slf4j-log4j12`,`org.apache.logging.log4j:log4j-to-slf4j` 和 `org.apache.logging.log4j:log4j-api`。
-For users and developers alike it is important to control the logging framework.
-The configuration of the logging framework is exclusively done by configuration files.
-The configuration file either has to be specified by setting the environment property `-Dlogback.configurationFile=<file>` or by putting `logback.xml` in the classpath.
-The `conf` directory contains a `logback.xml` file which can be modified and is used if Flink is started outside of an IDE and with the provided starting scripts.
-The provided `logback.xml` has the following form:
+在 IDE 中,这意味着你必须替换在 pom 文件中定义的依赖项,并尽可能在传递依赖于它们的依赖项上添加排除项。
+
+对于 Flink 发行版,这意味着你必须
+- 从 `lib` 目录中移除 `log4j-core`,`log4j-slf4j-impl` 和 `log4j-1.2-api` jars,
+- 向 `lib` 目录中添加 `log4j`,`slf4j-log4j12` 和 `log4j-to-slf4j` jars,
+- 用兼容的 Log4j1 版本替换 `conf` 目录中的所有 log4j 属性文件。
Review comment:
```suggestion
- 用兼容的 Log4j1 版本替换 `conf` 目录中的所有 log4j 配置文件。
```
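
As a hypothetical illustration of the classpath requirements quoted above (the probe class names are representative picks for each artifact, not taken from the PR), a Log4j1 setup could be sanity-checked like this:

```java
public class Log4j1SetupCheck {

    // Returns true if the given class can be resolved on the current classpath.
    private static boolean onClasspath(String className) {
        try {
            Class.forName(className, false, Log4j1SetupCheck.class.getClassLoader());
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Must NOT be present: log4j-core (and likewise log4j-slf4j-impl, log4j-1.2-api).
        System.out.println("log4j-core absent: "
                + !onClasspath("org.apache.logging.log4j.core.LoggerContext"));
        // Must be present: log4j:log4j and org.slf4j:slf4j-log4j12.
        System.out.println("log4j 1.x present: "
                + onClasspath("org.apache.log4j.PropertyConfigurator"));
        System.out.println("slf4j-log4j12 present: "
                + onClasspath("org.slf4j.impl.Log4jLoggerAdapter"));
    }
}
```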
##########
File path: docs/monitoring/logging.zh.md
##########
@@ -23,47 +23,51 @@ specific language governing permissions and limitations
under the License.
-->
-The logging in Flink is implemented using the slf4j logging interface. As underlying logging framework, log4j2 is used. We also provide logback configuration files and pass them to the JVM's as properties. Users willing to use logback instead of log4j2 can just exclude log4j2 (or delete it from the lib/ folder).
+Flink 中的日志记录是使用 slf4j 日志接口实现的。使用 log4j2 作为底层日志框架。我们也支持了 logback 日志配置,只要将其配置文件作为属性传递给 JVM 即可。愿意使用 logback 而不是 log4j2 的用户只需排除 log4j2 的依赖(或从 lib/ 文件夹中删除它)即可。
* This will be replaced by the TOC
{:toc}
-## Configuring Log4j2
+<a name="configuring-log4j2"></a>
-Log4j2 is controlled using property files. In Flink's case, the file is usually called `log4j.properties`. We pass the filename and location of this file using the `-Dlog4j.configurationFile=` parameter to the JVM.
+## 配置 Log4j2
-Flink ships with the following default properties files:
+Log4j2 是使用属性文件指定的。在 Flink 的使用中,该文件通常命名为 `log4j.properties`。我们使用 `-Dlog4j.configurationFile=` 参数将该文件的文件名和位置传递给 JVM。
Review comment:
```suggestion
Log4j2 是使用配置文件指定的。在 Flink 的使用中,该文件通常命名为 `log4j.properties`。我们使用 `-Dlog4j.configurationFile=` 参数将该文件的文件名和位置传递给 JVM。
```
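
Side note on the `-Dlog4j.configurationFile=` flag discussed here: a small sketch (assuming it runs inside a JVM started by the Flink scripts) that prints the configuration file location handed to the JVM.

```java
public class ShowLog4jConfigLocation {

    public static void main(String[] args) {
        // Set by the Flink start scripts via -Dlog4j.configurationFile=<path>.
        String location = System.getProperty("log4j.configurationFile");
        if (location != null) {
            System.out.println("Log4j2 configuration file: " + location);
        } else {
            System.out.println("No -Dlog4j.configurationFile= given; Log4j2 uses its default lookup.");
        }
    }
}
```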
##########
File path: docs/monitoring/logging.zh.md
##########
@@ -23,47 +23,51 @@ specific language governing permissions and limitations
under the License.
-->
-The logging in Flink is implemented using the slf4j logging interface. As underlying logging framework, log4j2 is used. We also provide logback configuration files and pass them to the JVM's as properties. Users willing to use logback instead of log4j2 can just exclude log4j2 (or delete it from the lib/ folder).
+Flink 中的日志记录是使用 slf4j 日志接口实现的。使用 log4j2 作为底层日志框架。我们也支持了 logback 日志配置,只要将其配置文件作为属性传递给 JVM 即可。愿意使用 logback 而不是 log4j2 的用户只需排除 log4j2 的依赖(或从 lib/ 文件夹中删除它)即可。
* This will be replaced by the TOC
{:toc}
-## Configuring Log4j2
+<a name="configuring-log4j2"></a>
-Log4j2 is controlled using property files. In Flink's case, the file is usually called `log4j.properties`. We pass the filename and location of this file using the `-Dlog4j.configurationFile=` parameter to the JVM.
+## 配置 Log4j2
-Flink ships with the following default properties files:
+Log4j2 是使用属性文件指定的。在 Flink 的使用中,该文件通常命名为 `log4j.properties`。我们使用 `-Dlog4j.configurationFile=` 参数将该文件的文件名和位置传递给 JVM。
-- `log4j-cli.properties`: Used by the Flink command line client (e.g. `flink run`) (not code executed on the cluster)
-- `log4j-session.properties`: Used by the Flink command line client when starting a YARN or Kubernetes session (`yarn-session.sh`, `kubernetes-session.sh`)
-- `log4j.properties`: JobManager/Taskmanager logs (both standalone and YARN)
+Flink 附带以下默认属性文件:
Review comment:
```suggestion
Flink 附带以下默认日志配置文件:
```
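
To relate the list of default properties files above to an actual installation, a hypothetical sketch (it assumes a `FLINK_HOME` environment variable pointing at an unpacked distribution, which the docs page itself does not define) that lists the shipped log4j configuration files in `conf/`:

```java
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class ListDefaultLog4jFiles {

    public static void main(String[] args) throws IOException {
        // FLINK_HOME is an assumption of this sketch.
        Path confDir = Paths.get(System.getenv("FLINK_HOME"), "conf");
        try (DirectoryStream<Path> stream = Files.newDirectoryStream(confDir, "log4j*.properties")) {
            for (Path file : stream) {
                // Expected entries include log4j-cli.properties, log4j-session.properties, log4j.properties.
                System.out.println(file.getFileName());
            }
        }
    }
}
```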
##########
File path: docs/monitoring/logging.zh.md
##########
@@ -23,47 +23,51 @@ specific language governing permissions and limitations
under the License.
-->
-The logging in Flink is implemented using the slf4j logging interface. As underlying logging framework, log4j2 is used. We also provide logback configuration files and pass them to the JVM's as properties. Users willing to use logback instead of log4j2 can just exclude log4j2 (or delete it from the lib/ folder).
+Flink 中的日志记录是使用 slf4j 日志接口实现的。使用 log4j2 作为底层日志框架。我们也支持了 logback 日志配置,只要将其配置文件作为属性传递给 JVM 即可。愿意使用 logback 而不是 log4j2 的用户只需排除 log4j2 的依赖(或从 lib/ 文件夹中删除它)即可。
* This will be replaced by the TOC
{:toc}
-## Configuring Log4j2
+<a name="configuring-log4j2"></a>
-Log4j2 is controlled using property files. In Flink's case, the file is usually called `log4j.properties`. We pass the filename and location of this file using the `-Dlog4j.configurationFile=` parameter to the JVM.
+## 配置 Log4j2
-Flink ships with the following default properties files:
+Log4j2 是使用属性文件指定的。在 Flink 的使用中,该文件通常命名为 `log4j.properties`。我们使用 `-Dlog4j.configurationFile=` 参数将该文件的文件名和位置传递给 JVM。
-- `log4j-cli.properties`: Used by the Flink command line client (e.g. `flink run`) (not code executed on the cluster)
-- `log4j-session.properties`: Used by the Flink command line client when starting a YARN or Kubernetes session (`yarn-session.sh`, `kubernetes-session.sh`)
-- `log4j.properties`: JobManager/Taskmanager logs (both standalone and YARN)
+Flink 附带以下默认属性文件:
-### Compatibility with Log4j1
+- `log4j-cli.properties`:由 Flink 命令行客户端使用(例如 `flink run`)(不包括在集群上执行的代码)
+- `log4j-session.properties`:Flink 命令行客户端在启动 YARN 或 Kubernetes session 时使用(`yarn-session.sh`,`kubernetes-session.sh`)
+- `log4j.properties`:作为 JobManager/TaskManager 日志配置使用(standalone 和 YARN 两种模式下皆使用)
-Flink ships with the [Log4j API bridge](https://logging.apache.org/log4j/log4j-2.2/log4j-1.2-api/index.html), allowing existing applications that work against Log4j1 classes to continue working.
+<a name="compatibility-with-log4j1"></a>
-If you have custom Log4j1 properties files or code that relies on Log4j1, please check out the official Log4j [compatibility](https://logging.apache.org/log4j/2.x/manual/compatibility.html) and [migration](https://logging.apache.org/log4j/2.x/manual/migration.html) guides.
+### 与 Log4j1 的兼容性
-## Configuring Log4j1
+Flink 附带了 [Log4j API bridge](https://logging.apache.org/log4j/log4j-2.2/log4j-1.2-api/index.html),使得对 Log4j1 工作的现有应用程序继续工作。
-To use Flink with Log4j1 you must ensure that:
-- `org.apache.logging.log4j:log4j-core`, `org.apache.logging.log4j:log4j-slf4j-impl` and `org.apache.logging.log4j:log4j-1.2-api` are not on the classpath,
-- `log4j:log4j`, `org.slf4j:slf4j-log4j12`, `org.apache.logging.log4j:log4j-to-slf4j` and `org.apache.logging.log4j:log4j-api` are on the classpath.
+如果你有基于 Log4j1 的自定义配置文件或代码,请查看官方 Log4j [兼容性](https://logging.apache.org/log4j/2.x/manual/compatibility.html)和[迁移](https://logging.apache.org/log4j/2.x/manual/migration.html)指南。
-In the IDE this means you have to replace such dependencies defined in your pom, and possibly add exclusions on dependencies that transitively depend on them.
+<a name="configuring-log4j1"></a>
-For Flink distributions this means you have to
-- remove the `log4j-core`, `log4j-slf4j-impl` and `log4j-1.2-api` jars from the `lib` directory,
-- add the `log4j`, `slf4j-log4j12` and `log4j-to-slf4j` jars to the `lib` directory,
-- replace all log4j properties files in the `conf` directory with Log4j1-compliant versions.
+## 配置 Log4j1
-## Configuring logback
+要将 Flink 与 Log4j1 一起使用,必须确保:
+- Classpath 中不存在 `org.apache.logging.log4j:log4j-core`,`org.apache.logging.log4j:log4j-slf4j-impl` 和 `org.apache.logging.log4j:log4j-1.2-api`,
+- 且 Classpath 中存在 `log4j:log4j`,`org.slf4j:slf4j-log4j12`,`org.apache.logging.log4j:log4j-to-slf4j` 和 `org.apache.logging.log4j:log4j-api`。
-For users and developers alike it is important to control the logging framework.
-The configuration of the logging framework is exclusively done by configuration files.
-The configuration file either has to be specified by setting the environment property `-Dlogback.configurationFile=<file>` or by putting `logback.xml` in the classpath.
-The `conf` directory contains a `logback.xml` file which can be modified and is used if Flink is started outside of an IDE and with the provided starting scripts.
-The provided `logback.xml` has the following form:
+在 IDE 中,这意味着你必须替换在 pom 文件中定义的依赖项,并尽可能在传递依赖于它们的依赖项上添加排除项。
+
+对于 Flink 发行版,这意味着你必须
+- 从 `lib` 目录中移除 `log4j-core`,`log4j-slf4j-impl` 和 `log4j-1.2-api` jars,
+- 向 `lib` 目录中添加 `log4j`,`slf4j-log4j12` 和 `log4j-to-slf4j` jars,
+- 用兼容的 Log4j1 版本替换 `conf` 目录中的所有 log4j 属性文件。
+
+<a name="configuring-logback"></a>
+
+## 配置 logback
+
+对于用户和开发人员来说,控制日志框架非常重要。日志框架的配置完全由配置文件完成。必须通过设置环境属性 `-Dlogback.configurationFile=<file>` 或将 `logback.xml` 放在 classpath 中来指定配置文件。`conf` 目录包含一个 `logback.xml` 文件,该文件可以修改,如果使用附带的启动脚本在 IDE 之外启动 Flink 则会使用该日志配置文件。提供的 `logback.xml` 具有以下格式:
Review comment:
```suggestion
对于用户和开发人员来说,控制日志框架非常重要。日志框架的配置完全由配置文件完成。必须通过设置环境参数 `-Dlogback.configurationFile=<file>` 或将 `logback.xml` 放在 classpath 中来指定配置文件。`conf` 目录包含一个 `logback.xml` 文件,该文件可以修改,如果使用附带的启动脚本在 IDE 之外启动 Flink 则会使用该日志配置文件。提供的 `logback.xml` 具有以下格式:
```
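
To make the two lookup paths in the logback paragraph concrete, a minimal sketch (not part of the patch): it first checks the `-Dlogback.configurationFile=<file>` property, then falls back to looking for a `logback.xml` on the classpath.

```java
import java.net.URL;

public class ShowLogbackConfigLocation {

    public static void main(String[] args) {
        // Explicitly passed to the JVM as -Dlogback.configurationFile=<file>.
        String explicit = System.getProperty("logback.configurationFile");
        if (explicit != null) {
            System.out.println("Explicit logback configuration: " + explicit);
            return;
        }
        // Otherwise logback would pick up a logback.xml found on the classpath.
        URL onClasspath = ShowLogbackConfigLocation.class.getClassLoader().getResource("logback.xml");
        System.out.println(onClasspath != null
                ? "logback.xml on the classpath: " + onClasspath
                : "No logback configuration found; logback falls back to its defaults.");
    }
}
```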
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]