wuchong commented on a change in pull request #8918: 
[FLINK-12944][docs]Translate Streaming File Sink page into Chinese
URL: https://github.com/apache/flink/pull/8918#discussion_r298822550
 
 

 ##########
 File path: docs/dev/connectors/streamfile_sink.zh.md
 ##########
 @@ -23,30 +23,23 @@ specific language governing permissions and limitations
 under the License.
 -->
 
-This connector provides a Sink that writes partitioned files to filesystems
-supported by the [Flink `FileSystem` abstraction]({{ site.baseurl}}/ops/filesystems/index.html).
+这个连接器支持向 [Flink `FileSystem` abstraction]({{ site.baseurl}}/zh/ops/filesystems/index.html)支持
+的文件系统写入分区文件。
 
-Since in streaming the input is potentially infinite, the streaming file sink writes data
-into buckets. The bucketing behaviour is configurable but a useful default is time-based
-bucketing where we start writing a new bucket every hour and thus get
-individual files that each contain a part of the infinite output stream.
+由于在流处理中输入可能是无限的,所以流处理的文件 sink 会将数据写入到桶中。如何分桶是可以配置的,一种有效的默认
+策略是基于时间的分桶,这种策略每个小时写入一个新的桶,这些桶各包含了无限输出流的一部分数据。
 
-Within a bucket, we further split the output into smaller part files based on a
-rolling policy. This is useful to prevent individual bucket files from getting
-too big. This is also configurable but the default policy rolls files based on
-file size and a timeout, *i.e* if no new data was written to a part file.
+在一个桶内部,会进一步将输出基于滚动策略切分成更小的文件。这有助于防止桶文件变得过大。滚动策略也是可以配置的,默认
+策略会根据文件大小和超时时间来滚动文件,超时时间是指没有新数据写入部分文件的时间。
 
 Review comment:
   ```suggestion
   策略会根据文件大小和超时时间来滚动文件,超时时间是指没有新数据写入部分文件(part file)的时间。
   ```
   
  I think `part file` is a term specific to the Streaming File Sink here; it would be best to show the original English term as well, so that users can understand it more easily.
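  For reference, the rolling behaviour described in this paragraph (a part file is rolled either when it grows past a size limit or when no new data has been written to it for some timeout) can be sketched in plain Java. The class and member names below are hypothetical illustrations, not Flink API:

  ```java
  // Hypothetical sketch of the rolling decision described in the docs:
  // roll a part file when it exceeds a size limit, or when it has been
  // inactive (no new writes) for longer than a timeout.
  public class RollingPolicySketch {

      private final long maxPartSizeBytes;    // roll once the part file reaches this size
      private final long inactivityTimeoutMs; // roll once no data has arrived for this long

      public RollingPolicySketch(long maxPartSizeBytes, long inactivityTimeoutMs) {
          this.maxPartSizeBytes = maxPartSizeBytes;
          this.inactivityTimeoutMs = inactivityTimeoutMs;
      }

      /** Decide whether the current part file should be closed and a new one opened. */
      public boolean shouldRoll(long currentPartSizeBytes, long lastWriteTimeMs, long nowMs) {
          boolean tooBig = currentPartSizeBytes >= maxPartSizeBytes;
          boolean inactive = (nowMs - lastWriteTimeMs) >= inactivityTimeoutMs;
          return tooBig || inactive;
      }
  }
  ```

  In the real sink both conditions are configurable on the rolling policy; the sketch only illustrates why the docs describe the timeout as "time since data was last written to the part file".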

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
