hapihu commented on a change in pull request #16852:
URL: https://github.com/apache/flink/pull/16852#discussion_r695828546



##########
File path: docs/content.zh/docs/dev/datastream/overview.md
##########
@@ -28,62 +28,45 @@ specific language governing permissions and limitations
 under the License.
 -->
 
+<a name="flink-datastream-api-programming-guide"></a>
+
 # Flink DataStream API 编程指南 
 
-DataStream programs in Flink are regular programs that implement transformations on data streams
-(e.g., filtering, updating state, defining windows, aggregating). The data streams are initially created from various
-sources (e.g., message queues, socket streams, files). Results are returned via sinks, which may for
-example write the data to files, or to standard output (for example the command line
-terminal). Flink programs run in a variety of contexts, standalone, or embedded in other programs.
-The execution can happen in a local JVM, or on clusters of many machines.
+Flink 中的 DataStream 程序是对数据流(例如过滤、更新状态、定义窗口、聚合)进行转换的常规程序。数据流最初是从各种源(例如消息队列、套接字流、文件)创建的。结果通过 sink 返回,例如可以将数据写入文件或标准输出(例如命令行终端)。Flink 程序可以在各种上下文中运行,可以独立运行,也可以嵌入到其它程序中。任务执行可以发生在本地 JVM 中,也可以发生在多台机器的集群上。
+
+为了创建你自己的 Flink DataStream 程序,我们建议你从 [Flink 程序剖析](#anatomy-of-a-flink-program)开始,然后逐渐添加自己的[流转换]({{< ref "docs/dev/datastream/operators/overview" >}})。其余部分用作额外算子和高级特性的参考。

Review comment:
       OK, I'll take your advice. I will update it as soon as possible.
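
       For anyone skimming the quoted paragraph, a minimal sketch of the source → transformation → sink flow it describes might look like the following. This is illustrative only and not part of this PR; the class name, host, port, and job name are made up.

       ```java
       import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

       public class MinimalDataStreamJob {
           public static void main(String[] args) throws Exception {
               // Obtain the execution environment; the program can run in a
               // local JVM or on a cluster of many machines.
               StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

               env.socketTextStream("localhost", 9999) // source: a socket stream (hypothetical host/port)
                   .filter(line -> !line.isEmpty())    // transformation: filtering
                   .print();                           // sink: standard output

               // Trigger program execution.
               env.execute("Minimal DataStream example");
           }
       }
       ```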



