YngwieWang commented on a change in pull request #9350: [FLINK-13485] [chinese-translation] Translate "Table API Example Walkthrough" page into Chinese
URL: https://github.com/apache/flink/pull/9350#discussion_r310370268
 
 

 ##########
 File path: docs/getting-started/walkthroughs/table_api.zh.md
 ##########
 @@ -399,27 +403,27 @@ tEnv.registerTableSource("transactions", new UnboundedTransactionTableSource)
 </div>
 </div>
 
-And that's it, a fully functional, stateful, distributed streaming application!
-The query continuously consumes the stream of transactions, computes the hourly spendings, and emits results as soon as they are ready.
-Since the input is unbounded, the query keeps running until it is manually stopped.
-And because the Job uses time window-based aggregations, Flink can perform specific optimizations such as state clean up when the framework knows that no more records will arrive for a particular window.
+这就是一个功能齐全、有状态的分布式流式应用!
+这个查询作业会不断地消费交易事件流,计算每小时的消费额,然后实时地输出结果。
+由于输入是无界的,这个查询作业会一直运行下去,直到被手动停止。
+因为这个作业使用了基于时间窗口的聚合,Flink 可以执行一些特定的优化,比如当框架知道某个特定的窗口不会再有新的记录到达时,就会对状态进行清理。
 
 {% highlight raw %}
-# Query 3 output showing account id, timestamp, and amount
+# 查询三的输出,显示了账户 id、时间戳和消费总额
 
-# These rows are calculated continuously over the hour 
-# and output immediately at the end of the hour
 > 1, 2019-01-01 00:00:00.0, $567.87
 > 2, 2019-01-01 00:00:00.0, $726.23
 
-# Flink begins computing these rows as soon as 
-# as the first record for the window arrives
+# 这些行在这一小时内被持续不断地计算,然后在这一小时结束时立刻输出
+
 > 1, 2019-01-01 01:00:00.0, $686.87
 > 2, 2019-01-01 01:00:00.0, $810.06
 
+当接收到该窗口的第一条数据时,Flink就开始进行计算这些数据了。
 
 Review comment:
   ```suggestion
   # 当接收到该窗口的第一条数据时,Flink 就开始计算了
   ```
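For context, the paragraph being translated describes the walkthrough's continuous hourly aggregation. Below is a minimal Java sketch of that query, assuming the Flink 1.9 Table API and the walkthrough's `UnboundedTransactionTableSource` / `SpendReportTableSink` helper classes (their package is taken from the walkthrough archetype and may differ):

```java
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Tumble;
import org.apache.flink.table.api.java.StreamTableEnvironment;
// Assumed to be provided by the walkthrough project:
import org.apache.flink.walkthrough.common.table.SpendReportTableSink;
import org.apache.flink.walkthrough.common.table.UnboundedTransactionTableSource;

public class SpendReport {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);

        // Unbounded source: the query keeps running until manually stopped.
        tEnv.registerTableSource("transactions", new UnboundedTransactionTableSource());
        tEnv.registerTableSink("spend_report", new SpendReportTableSink());

        tEnv
            .scan("transactions")                                   // consume the transaction stream
            .window(Tumble.over("1.hour").on("timestamp").as("w"))  // hourly tumbling windows
            .groupBy("accountId, w")                                // one aggregate per account per window
            .select("accountId, w.start as timestamp, amount.sum")  // account id, window start, hourly total
            .insertInto("spend_report");                            // each row is emitted when its window closes

        env.execute("Spend Report");
    }
}
```

Because the aggregation is window-based, Flink can drop the state for a window once no more records can arrive for it, which is the state cleanup the paragraph refers to.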
