[flink-web] 04/04: Rebuild website (release-1.12.2)

2021-03-03 Thread roman
This is an automated email from the ASF dual-hosted git repository.

roman pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git

commit fb9a6ea45d3d8bf944697ef619aeb6b21973965e
Author: Roman Khachatryan 
AuthorDate: Wed Mar 3 16:34:02 2021 +0100

Rebuild website (release-1.12.2)
---
 content/blog/feed.xml  | 256 -
 content/blog/index.html|  38 +++---
 content/blog/page10/index.html |  38 +++---
 content/blog/page11/index.html |  40 ---
 content/blog/page12/index.html |  40 ---
 content/blog/page13/index.html |  40 ---
 content/blog/page14/index.html |  40 ---
 content/blog/page15/index.html |  25 
 content/blog/page2/index.html  |  38 +++---
 content/blog/page3/index.html  |  41 ---
 content/blog/page4/index.html  |  39 ---
 content/blog/page5/index.html  |  36 +++---
 content/blog/page6/index.html  |  36 +++---
 content/blog/page7/index.html  |  36 +++---
 content/blog/page8/index.html  |  38 +++---
 content/blog/page9/index.html  |  38 +++---
 content/downloads.html |  33 --
 content/index.html |   8 +-
 content/q/gradle-quickstart.sh |   2 +-
 content/q/quickstart-scala.sh  |   2 +-
 content/q/quickstart.sh|   2 +-
 content/q/sbt-quickstart.sh|   2 +-
 content/zh/downloads.html  |  37 +++---
 content/zh/index.html  |   8 +-
 24 files changed, 651 insertions(+), 262 deletions(-)

diff --git a/content/blog/feed.xml b/content/blog/feed.xml
index c63e53b..750a7af 100644
--- a/content/blog/feed.xml
+++ b/content/blog/feed.xml
@@ -7,6 +7,233 @@
 <atom:link href="https://flink.apache.org/blog/feed.xml" rel="self" type="application/rss+xml" />
 
 
+<title>Apache Flink 1.12.2 Released</title>
+<p>The Apache Flink community released the next bugfix version of the Apache Flink 1.12 series.</p>
+
+<p>This release includes 83 fixes and minor improvements for Flink 1.12.1. The list below details all fixes and improvements.</p>
+
+<p>We highly recommend that all users upgrade to Flink 1.12.2.</p>
+
+<p>Updated Maven dependencies:</p>
+
+<dependency>
+  <groupId>org.apache.flink</groupId>
+  <artifactId>flink-java</artifactId>
+  <version>1.12.2</version>
+</dependency>
+<dependency>
+  <groupId>org.apache.flink</groupId>
+  <artifactId>flink-streaming-java_2.11</artifactId>
+  <version>1.12.2</version>
+</dependency>
+<dependency>
+  <groupId>org.apache.flink</groupId>
+  <artifactId>flink-clients_2.11</artifactId>
+  <version>1.12.2</version>
+</dependency>
+
+<p>You can find the binaries on the updated <a href="/downloads.html">Downloads page</a>.</p>
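For illustration, a minimal job compiled against the 1.12.2 artifacts above might look like the following sketch; the class name is hypothetical, and it assumes the flink-streaming-java_2.11 and flink-clients_2.11 dependencies listed above (version 1.12.2) are on the classpath.

// Minimal sketch of a job built against the Flink 1.12.2 dependencies above (hypothetical class name).
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class Release122Example {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        // Trivial pipeline: emit three numbers, double them, print to stdout.
        env.fromElements(1, 2, 3)
           .map(i -> i * 2)
           .print();
        // Submit the job; with flink-clients on the classpath this runs in a local mini-cluster.
        env.execute("flink-1.12.2-example");
    }
}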
+
+<p>List of resolved issues:</p>
+
+<h2>Sub-task</h2>
+<ul>
+<li>[<a href="https://issues.apache.org/jira/browse/FLINK-21070">FLINK-21070</a>] - Overloaded aggregate functions cause converter errors</li>
+<li>[<a href="https://issues.apache.org/jira/browse/FLINK-21486">FLINK-21486</a>] - Add sanity check when switching from Rocks to Heap timers</li>
+</ul>
+
+<h2>Bug</h2>
+<ul>
+<li>[<a href="https://issues.apache.org/jira/browse/FLINK-12461">FLINK-12461</a>] - Document binary compatibility situation with Scala beyond 2.12.8</li>
+<li>[<a href="https://issues.apache.org/jira/browse/FLINK-16443">FLINK-16443</a>] - Fix wrong fix for user-code CheckpointExceptions</li>
+<li>[<a href="https://issues.apache.org/jira/browse/FLINK-19771">FLINK-19771</a>] - NullPointerException when accessing null array from postgres in JDBC Connector</li>
+<li>[<a href="https://issues.apache.org/jira/browse/FLINK-20309">FLINK-20309</a>] - UnalignedCheckpointTestBase.execute is failed</li>
+<li>[<a href="https://issues.apache.org/jira/browse/FLINK-20462">FLINK-20462</a>] - MailboxOperatorTest.testAvoidTaskStarvation</li>
+<li>[<a href="https://issues.apache.org/jira/browse/FLINK-20500">FLINK-20500</a>] - UpsertKafkaTableITCase.testTemporalJoin test failed</li>
+<li>[<a href="https://issues.apache.org/jira/browse/FLINK-20565">FLINK-20565</a>] - Fix typo in EXPLAIN Statements docs.</li>
+<li>[<a href="https://issues.apache.org/jira/browse/FLINK-20580">FLINK-20580</a>] - Missing null value handling for SerializedValue's getByteArray()</li>
+<li>[<a href="https://issues.apache.org/jira/browse/FLINK-20654">FLINK-20654</a>] - Unaligned checkpoint recovery may lead to corrupted data stream</li>
+<li>[<a

[flink-web] 04/04: Rebuild website

2020-07-06 Thread pnowojski
This is an automated email from the ASF dual-hosted git repository.

pnowojski pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git

commit e49106f80359283b6c620aaca255b3f5fb8cbf7c
Author: Piotr Nowojski 
AuthorDate: Mon Jul 6 15:04:50 2020 +0200

Rebuild website
---
 content/blog/page10/index.html | 4 ++--
 content/blog/page11/index.html | 4 ++--
 content/blog/page12/index.html | 4 ++--
 content/blog/page2/index.html  | 4 ++--
 content/blog/page3/index.html  | 4 ++--
 content/blog/page4/index.html  | 4 ++--
 content/blog/page5/index.html  | 4 ++--
 content/blog/page6/index.html  | 4 ++--
 content/blog/page7/index.html  | 4 ++--
 content/blog/page8/index.html  | 4 ++--
 content/blog/page9/index.html  | 4 ++--
 11 files changed, 22 insertions(+), 22 deletions(-)

diff --git a/content/blog/page10/index.html b/content/blog/page10/index.html
index 730edc2..5bc32c9 100644
--- a/content/blog/page10/index.html
+++ b/content/blog/page10/index.html
@@ -92,7 +92,7 @@
 
   Getting Started
   
-<a href="https://ci.apache.org/projects/flink/flink-docs-release-1.10/getting-started/index.html" target="_blank">With Flink</a>
+<a href="https://ci.apache.org/projects/flink/flink-docs-release-1.11/getting-started/index.html" target="_blank">With Flink</a>
 <a href="https://ci.apache.org/projects/flink/flink-statefun-docs-release-2.1/getting-started/project-setup.html" target="_blank">With Flink Stateful Functions</a>
 Training Course
   
@@ -102,7 +102,7 @@
 
   Documentation
   
-<a href="https://ci.apache.org/projects/flink/flink-docs-release-1.10" target="_blank">Flink 1.10 (Latest stable release)</a>
+<a href="https://ci.apache.org/projects/flink/flink-docs-release-1.11" target="_blank">Flink 1.11 (Latest stable release)</a>
 <a href="https://ci.apache.org/projects/flink/flink-docs-master" target="_blank">Flink Master (Latest Snapshot)</a>
 <a href="https://ci.apache.org/projects/flink/flink-statefun-docs-release-2.1" target="_blank">Flink Stateful Functions 2.1 (Latest stable release)</a>
 <a href="https://ci.apache.org/projects/flink/flink-statefun-docs-master" target="_blank">Flink Stateful Functions Master (Latest Snapshot)</a>
diff --git a/content/blog/page11/index.html b/content/blog/page11/index.html
index 9f9951f..2b4dfdf 100644
--- a/content/blog/page11/index.html
+++ b/content/blog/page11/index.html
@@ -92,7 +92,7 @@
 
   Getting Started
   
-<a href="https://ci.apache.org/projects/flink/flink-docs-release-1.10/getting-started/index.html" target="_blank">With Flink</a>
+<a href="https://ci.apache.org/projects/flink/flink-docs-release-1.11/getting-started/index.html" target="_blank">With Flink</a>
 <a href="https://ci.apache.org/projects/flink/flink-statefun-docs-release-2.1/getting-started/project-setup.html" target="_blank">With Flink Stateful Functions</a>
 Training Course
   
@@ -102,7 +102,7 @@
 
   Documentation
   
-<a href="https://ci.apache.org/projects/flink/flink-docs-release-1.10" target="_blank">Flink 1.10 (Latest stable release)</a>
+<a href="https://ci.apache.org/projects/flink/flink-docs-release-1.11" target="_blank">Flink 1.11 (Latest stable release)</a>
 <a href="https://ci.apache.org/projects/flink/flink-docs-master" target="_blank">Flink Master (Latest Snapshot)</a>
 <a href="https://ci.apache.org/projects/flink/flink-statefun-docs-release-2.1" target="_blank">Flink Stateful Functions 2.1 (Latest stable release)</a>
 <a href="https://ci.apache.org/projects/flink/flink-statefun-docs-master" target="_blank">Flink Stateful Functions Master (Latest Snapshot)</a>
diff --git a/content/blog/page12/index.html b/content/blog/page12/index.html
index b89c090..8d3a1d2 100644
--- a/content/blog/page12/index.html
+++ b/content/blog/page12/index.html
@@ -92,7 +92,7 @@
 
   Getting Started
   
-<a href="https://ci.apache.org/projects/flink/flink-docs-release-1.10/getting-started/index.html" target="_blank">With Flink</a>
+<a href="https://ci.apache.org/projects/flink/flink-docs-release-1.11/getting-started/index.html" target="_blank">With Flink</a>
 <a href="https://ci.apache.org/projects/flink/flink-statefun-docs-release-2.1/getting-started/project-setup.html" target="_blank">With Flink Stateful Functions</a>
 Training Course
   
@@ -102,7 +102,7 @@
 
   Documentation
   
-<a href="https://ci.apache.org/projects/flink/flink-docs-release-1.10" target="_blank">Flink 1.10 (Latest stable release)</a>
+<a href="https://ci.apache.org/projects/flink/flink-docs-release-1.11" target="_blank">Flink 1.11 (Latest stable release)</a>
 

[flink-web] 04/04: Rebuild website

2020-06-15 Thread fhueske
This is an automated email from the ASF dual-hosted git repository.

fhueske pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git

commit ac41805001489cdf211456f131cca4c397188ec2
Author: Fabian Hueske 
AuthorDate: Mon Jun 15 16:19:40 2020 +0200

Rebuild website
---
 content/blog/feed.xml  | 135 +---
 content/blog/index.html|  43 ++-
 content/blog/page10/index.html |  36 ++-
 content/blog/page11/index.html |  38 ++-
 content/blog/page12/index.html |  25 ++
 content/blog/page2/index.html  |  39 ++-
 content/blog/page3/index.html  |  38 ++-
 content/blog/page4/index.html  |  38 ++-
 content/blog/page5/index.html  |  38 ++-
 content/blog/page6/index.html  |  40 ++-
 content/blog/page7/index.html  |  40 ++-
 content/blog/page8/index.html  |  40 ++-
 content/blog/page9/index.html  |  38 ++-
 .../2020-06-15-flink-on-zeppelin/create_sink.png   | Bin 0 -> 138803 bytes
 .../2020-06-15-flink-on-zeppelin/create_source.png | Bin 0 -> 147213 bytes
 .../img/blog/2020-06-15-flink-on-zeppelin/etl.png  | Bin 0 -> 55319 bytes
 .../blog/2020-06-15-flink-on-zeppelin/preview.png  | Bin 0 -> 89756 bytes
 content/index.html |  10 +-
 .../news/2020/06/15/flink-on-zeppelin-part1.html   | 351 +
 content/zh/index.html  |  10 +-
 20 files changed, 756 insertions(+), 203 deletions(-)

diff --git a/content/blog/feed.xml b/content/blog/feed.xml
index 23b09f0..eabf57c 100644
--- a/content/blog/feed.xml
+++ b/content/blog/feed.xml
@@ -7,6 +7,102 @@
 <atom:link href="https://flink.apache.org/blog/feed.xml" rel="self" type="application/rss+xml" />
 
 
+<title>Flink on Zeppelin Notebooks for Interactive Data Analysis - Part 1</title>
+<p>The latest release of <a href="https://zeppelin.apache.org/">Apache Zeppelin</a> comes with a redesigned interpreter for Apache Flink (only Flink 1.10+ is supported moving forward)
+that allows developers to use Flink directly on Zeppelin notebooks for interactive data analysis. I wrote two posts about how to use Flink in Zeppelin.
+This is part 1, where I explain how the Flink interpreter in Zeppelin works and provide a tutorial for running streaming ETL with Flink on Zeppelin.</p>
+
+<h1 id="the-flink-interpreter-in-zeppelin-09">The Flink Interpreter in Zeppelin 0.9</h1>
+
+<p>The Flink interpreter can be accessed and configured from Zeppelin’s interpreter settings page.
+The interpreter has been refactored so that Flink users can now take advantage of Zeppelin to write Flink applications in three languages,
+namely Scala, Python (PyFlink) and SQL (for both batch & streaming executions).
+Zeppelin 0.9 now comes with the Flink interpreter group, consisting of the five interpreters below:</p>
+
+<ul>
+  <li>%flink - Provides a Scala environment</li>
+  <li>%flink.pyflink - Provides a Python environment</li>
+  <li>%flink.ipyflink - Provides an IPython environment</li>
+  <li>%flink.ssql - Provides a streaming SQL environment</li>
+  <li>%flink.bsql - Provides a batch SQL environment</li>
+</ul>
+
+<p>Not only has the interpreter been extended to support writing Flink applications in three languages, but it has also extended the available execution modes for Flink, which now include:</p>
+
+<ul>
+  <li>Running Flink in Local Mode</li>
+  <li>Running Flink in Remote Mode</li>
+  <li>Running Flink in Yarn Mode</li>
+</ul>
+
+<p>You can find more information about how to get started with Zeppelin and all the execution modes for Flink applications in <a href="https://github.com/apache/zeppelin/tree/master/notebook/Flink%20Tutorial">Zeppelin notebooks</a> in this post.</p>
+
+<h1 id="flink-on-zeppelin-for-stream-processing">Flink on Zeppelin for Stream Processing</h1>
+
+<p>Performing stream processing jobs with Apache Flink on Zeppelin allows you to run most major streaming cases,
+such as streaming ETL and real-time data analytics, with the use of Flink SQL and specific UDFs.
+Below we showcase how you can execute streaming ETL using Flink on Zeppelin:</p>
+
+<p>You can use Flink SQL to perform streaming ETL by following the steps below
+(for the full tutorial, please refer to the <a href="https://github.com/apache/zeppelin/blob/master/notebook/Flink%20Tutorial/4.%20Streaming%20ETL_2EYD56B9B.zpln">Flink Tutorial/Streaming ETL tutorial</a> of the Zeppelin distribution):</p>
+
+<ul>
+  <li>Step 1. Create a source table to represent the source data.</li>
+</ul>
+
+<center>
+<img src="/img/blog/2020-06-15-flink-on-zeppelin/create_source.png" width="80%" alt="Create Source Table" />
+</center>
+
+<ul>
+  <li>Step 2. Create a sink table to represent the processed data.</li>
+</ul>
+
+<center>
+<img src="/img/blog/2020-06-15-flink-on-zeppelin/create_sink.png" width="80%" alt="Create Sink Table" />
+</center>
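The actual DDL for these steps lives in the linked Zeppelin notebook and is shown in the screenshots above. As a rough sketch of the same create-source / create-sink / insert pattern outside of a %flink.ssql paragraph, here is a hypothetical Java Table API (Flink 1.11+) equivalent; the table names, schema, and the built-in datagen/print connectors are illustrative choices, not the tutorial's.

// Hypothetical sketch of the streaming ETL pattern from Steps 1-2, run from plain Java.
// Assumes the Flink Table API (flink-table-api-java-bridge plus a planner) is on the classpath.
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class StreamingEtlSketch {
    public static void main(String[] args) {
        TableEnvironment tEnv = TableEnvironment.create(
                EnvironmentSettings.newInstance().inStreamingMode().build());

        // Step 1: a source table representing the incoming data (datagen emits random rows).
        tEnv.executeSql(
            "CREATE TABLE source_table (id BIGINT, item STRING) " +
            "WITH ('connector' = 'datagen', 'rows-per-second' = '10')");

        // Step 2: a sink table representing the processed data (print writes rows to stdout).
        tEnv.executeSql(
            "CREATE TABLE sink_table (id BIGINT, item STRING) " +
            "WITH ('connector' = 'print')");

        // The continuous ETL query itself; executeSql submits it as a streaming job.
        tEnv.executeSql("INSERT INTO sink_table SELECT id, item FROM source_table");
    }
}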
+
+<ul>
+  <li>Step

[flink-web] 04/04: Rebuild website

2019-09-03 Thread chesnay
This is an automated email from the ASF dual-hosted git repository.

chesnay pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git

commit b9e62de724f5c0cef8053f73d4108dc8199ac89e
Author: Chesnay Schepler 
AuthorDate: Mon Sep 2 11:07:09 2019 +0200

Rebuild website
---
 content/downloads.html| 74 +++
 content/zh/downloads.html | 74 +++
 2 files changed, 74 insertions(+), 74 deletions(-)

diff --git a/content/downloads.html b/content/downloads.html
index 08120bd..edd0747 100644
--- a/content/downloads.html
+++ b/content/downloads.html
@@ -461,7 +461,7 @@ Flink 1.9.0 - 2019-08-22
 <a href="https://archive.apache.org/dist/flink/flink-1.9.0">Binaries</a>, 
 <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.9">Docs</a>, 
 <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.9/api/java">Javadocs</a>, 
-<a href="https://ci.apache.org/projects/flink/flink-docs-release-1.9/api/scala/index.html">ScalaDocs</a>)
+<a href="https://ci.apache.org/projects/flink/flink-docs-release-1.9/api/scala/index.html">Scaladocs</a>)
 
 
 
@@ -472,7 +472,7 @@ Flink 1.8.1 - 2019-07-02
 <a href="https://archive.apache.org/dist/flink/flink-1.8.1">Binaries</a>, 
 <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.8">Docs</a>, 
 <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.8/api/java">Javadocs</a>, 
-<a href="https://ci.apache.org/projects/flink/flink-docs-release-1.8/api/scala/index.html">ScalaDocs</a>)
+<a href="https://ci.apache.org/projects/flink/flink-docs-release-1.8/api/scala/index.html">Scaladocs</a>)
 
 
 
@@ -483,7 +483,7 @@ Flink 1.8.0 - 2019-04-09
 <a href="https://archive.apache.org/dist/flink/flink-1.8.0">Binaries</a>, 
 <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.8">Docs</a>, 
 <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.8/api/java">Javadocs</a>, 
-<a href="https://ci.apache.org/projects/flink/flink-docs-release-1.8/api/scala/index.html">ScalaDocs</a>)
+<a href="https://ci.apache.org/projects/flink/flink-docs-release-1.8/api/scala/index.html">Scaladocs</a>)
 
 
 
@@ -494,7 +494,7 @@ Flink 1.7.2 - 2019-02-15
 <a href="https://archive.apache.org/dist/flink/flink-1.7.2">Binaries</a>, 
 <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.7">Docs</a>, 
 <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.7/api/java">Javadocs</a>, 
-<a href="https://ci.apache.org/projects/flink/flink-docs-release-1.7/api/scala/index.html">ScalaDocs</a>)
+<a href="https://ci.apache.org/projects/flink/flink-docs-release-1.7/api/scala/index.html">Scaladocs</a>)
 
 
 
@@ -505,7 +505,7 @@ Flink 1.7.1 - 2018-12-21
 <a href="https://archive.apache.org/dist/flink/flink-1.7.1">Binaries</a>, 
 <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.7">Docs</a>, 
 <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.7/api/java">Javadocs</a>, 
-<a href="https://ci.apache.org/projects/flink/flink-docs-release-1.7/api/scala/index.html">ScalaDocs</a>)
+<a href="https://ci.apache.org/projects/flink/flink-docs-release-1.7/api/scala/index.html">Scaladocs</a>)
 
 
 
@@ -516,7 +516,7 @@ Flink 1.7.0 - 2018-11-30
 <a href="https://archive.apache.org/dist/flink/flink-1.7.0">Binaries</a>, 
 <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.7">Docs</a>, 
 <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.7/api/java">Javadocs</a>, 
-<a href="https://ci.apache.org/projects/flink/flink-docs-release-1.7/api/scala/index.html">ScalaDocs</a>)
+<a href="https://ci.apache.org/projects/flink/flink-docs-release-1.7/api/scala/index.html">Scaladocs</a>)
 
 
 
@@ -527,7 +527,7 @@ Flink 1.6.4 - 2019-02-25
 <a href="https://archive.apache.org/dist/flink/flink-1.6.4">Binaries</a>, 
 <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.6">Docs</a>, 
 <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.6/api/java">Javadocs</a>, 
-<a href="https://ci.apache.org/projects/flink/flink-docs-release-1.6/api/scala/index.html">ScalaDocs</a>)
+<a href="https://ci.apache.org/projects/flink/flink-docs-release-1.6/api/scala/index.html">Scaladocs</a>)
 
 
 
@@ -538,7 +538,7 @@ Flink 1.6.3 - 2018-12-22
 <a href="https://archive.apache.org/dist/flink/flink-1.6.3">Binaries</a>, 
 <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.6">Docs</a>, 
 <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.6/api/java">Javadocs</a>, 
-<a href="https://ci.apache.org/projects/flink/flink-docs-release-1.6/api/scala/index.html">ScalaDocs</a>)
+<a href="https://ci.apache.org/projects/flink/flink-docs-release-1.6/api/scala/index.html">Scaladocs</a>)
 
 
 
@@ -549,7 +549,7 @@ Flink 1.6.2 - 2018-10-29
 <a href="https://archive.apache.org/dist/flink/flink-1.6.2">Binaries</a>, 
 <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.6">Docs</a>, 
 <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.6/api/java">Javadocs</a>, 
-<a href="https://ci.apache.org/projects/flink/flink-docs-release-1.6/api/scala/index.html">ScalaDocs</a>)
+<a href="https://ci.apache.org/projects/flink/flink-docs-release-1.6/api/scala/index.html">Scaladocs</a>)
 
 
 
@@ -560,7 +560,7 @@ Flink 1.6.1 - 2018-09-19
 <a href="https://archive.apache.org/dist/flink/flink-1.6.1">Binaries</a>, 
 <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.6">Docs</a>, 
 

[flink-web] 04/04: Rebuild website

2019-07-23 Thread fhueske
This is an automated email from the ASF dual-hosted git repository.

fhueske pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git

commit dec96f63d784e24a20a86bf6ffd861cc3839c3e6
Author: Fabian Hueske 
AuthorDate: Tue Jul 23 18:43:53 2019 +0200

Rebuild website
---
 content/img/poweredby/didi-logo.png | Bin 0 -> 50878 bytes
 content/index.html  |   6 ++
 content/poweredby.html  |   5 +
 content/zh/index.html   |   6 ++
 content/zh/poweredby.html   |   5 +
 5 files changed, 22 insertions(+)

diff --git a/content/img/poweredby/didi-logo.png 
b/content/img/poweredby/didi-logo.png
new file mode 100644
index 000..9f62844
Binary files /dev/null and b/content/img/poweredby/didi-logo.png differ
diff --git a/content/index.html b/content/index.html
index f453f3f..e4f49f0 100644
--- a/content/index.html
+++ b/content/index.html
@@ -338,6 +338,12 @@
 
 
   
+
+  
+
+
+
+  
 
   
 
diff --git a/content/poweredby.html b/content/poweredby.html
index 82d358f..e4b89b7 100644
--- a/content/poweredby.html
+++ b/content/poweredby.html
@@ -199,6 +199,11 @@
   Criteo is the advertising platform for the open internet and uses Flink for real-time revenue monitoring and near-real-time event processing. <a href="https://medium.com/criteo-labs/criteo-streaming-flink-31816c08da50" target="_blank">Learn about Criteo's Flink use case</a>
   
   
+
+Didi Chuxing (“DiDi”), the world's leading mobile transportation platform, uses Apache Flink for real-time monitoring, feature extraction, and ETL.
+<a href="https://blog.didiyun.com/index.php/2018/12/05/realtime-compute/" target="_blank">Learn about Didi's Flink use case</a>
+  
+  
 
   Drivetribe, a digital community founded by the former hosts of “Top Gear”, uses Flink for metrics and content recommendations. <a href="https://ververica.com/blog/drivetribe-cqrs-apache-flink/" target="_blank">Read about Flink in the Drivetribe stack</a>
   
diff --git a/content/zh/index.html b/content/zh/index.html
index 12730e5..4945e11 100644
--- a/content/zh/index.html
+++ b/content/zh/index.html
@@ -337,6 +337,12 @@
 
 
   
+
+  
+
+
+
+  
 
   
 
diff --git a/content/zh/poweredby.html b/content/zh/poweredby.html
index 19905bc..b9aca63 100644
--- a/content/zh/poweredby.html
+++ b/content/zh/poweredby.html
@@ -197,6 +197,11 @@
  Criteo 是开放互联网的广告平台,使用 Flink 进行实时收入监控和近实时事件处理。<a href="https://medium.com/criteo-labs/criteo-streaming-flink-31816c08da50" target="_blank">了解 Criteo 的 Flink 用例</a>
   
   
+
+滴滴出行是全球卓越的移动出行平台,使用 Apache Flink 支持了实时监控、实时特征抽取、实时 ETL 等业务。
+<a href="https://blog.didiyun.com/index.php/2018/12/05/realtime-compute/" target="_blank">了解滴滴如何使用 Flink 的。</a>
+  
+  
 
  Drivetribe 是由前“Top Gear”主持人创建的数字社区,它使用 Flink 作为指标和内容推荐。<a href="https://ververica.com/blog/drivetribe-cqrs-apache-flink/" target="_blank">了解 Flink 在 Drivetribe stack 的应用</a>
   



[flink-web] 04/04: Rebuild website

2019-02-20 Thread jark
This is an automated email from the ASF dual-hosted git repository.

jark pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git

commit 483d732103349b336b69d10e50d408630338f38e
Author: 云邪 
AuthorDate: Wed Feb 20 16:39:08 2019 +0800

Rebuild website
---
 content/zh/faq.html   |  87 ++
 content/zh/how-to-contribute.html | 150 +++---
 2 files changed, 112 insertions(+), 125 deletions(-)

diff --git a/content/zh/faq.html b/content/zh/faq.html
index 7a92808..017314b 100644
--- a/content/zh/faq.html
+++ b/content/zh/faq.html
@@ -169,88 +169,77 @@ under the License.
 
 
 
-The following questions are frequently asked with regard to the Flink project in general.
+以下这些是 Flink 项目中经常会被问到的常见问题。
 
-If you have further questions, make sure to consult the <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.7">documentation</a> or ask the community.
+如果你还有其他问题,请先查阅 <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.7">文档</a> 或咨询社区。
 
 
 
-  General
-  Is Apache Flink only for (near) real-time processing use cases?
-  If everything is a stream, why are there a DataStream and a DataSet API in Flink?
-  How does Flink relate to the Hadoop Stack?
-  What other stacks does Flink run in?
-  What are the prerequisites to use Flink?
-  What scale does Flink support?
-  Is Flink limited to in-memory data sets?
+  常见问题
+  Apache Flink 仅适用于(近)实时处理场景吗?
+  如果一切都是流,为什么 Flink 中同时有 DataStream 和 DataSet API?
+  Flink 与 Hadoop 软件栈是什么关系?
+  Flink 运行的其他软件栈是什么?
+  使用 Flink 的先决条件是什么?
+  Flink 支持多大的规模?
+  Flink 是否仅限于内存数据集?
 
   
-  Common Error Messages
+  常见错误消息
 
 
 
 
-General
+常见问题
 
-Is Apache Flink only for (near) real-time processing use cases?
+Apache Flink 仅适用于(近)实时处理场景吗?
 
-Flink is a very general system for data processing and data-driven applications with data streams as
-the core building block. These data streams can be streams of real-time data, or stored streams of historic data.
-For example, in Flink’s view a file is a stored stream of bytes. Because of that, Flink
-supports both real-time data processing and applications, as well as batch processing applications.
+Flink 是一个非常通用的系统,它以 数据流 为核心,用于数据处理和数据驱动的应用程序。这些数据流可以是实时数据流或存储的历史数据流。例如,Flink 认为文件是存储的字节流。因此,Flink 同时支持实时数据处理和批处理应用程序。
 
-Streams can be unbounded (have no end, events continuously keep coming) or be bounded (streams have a beginning
-and an end). For example, a Twitter feed or a stream of events from a message queue are generally unbounded streams,
-whereas a stream of bytes from a file is a bounded stream.
+流可以是 无界的 (不会结束,源源不断地发生事件)或 有界的 (流有开始和结束)。例如,来自消息队列的 Twitter 信息流或事件流通常是无界的流,而来自文件的字节流是有界的流。
 
-If everything is a stream, why are there a DataStream and a DataSet API in Flink?
+如果一切都是流,为什么 Flink 中同时有 DataStream 和 DataSet API?
 
-Bounded streams are often more efficient to process than unbounded streams. Processing unbounded streams of events
-in (near) real-time requires the system to be able to immediately act on events and to produce intermediate
-results (often with low latency). Processing bounded streams usually does not require producing low latency results, because the data is a while old
-anyway (in relative terms). That allows Flink to process the data in a simple and more efficient way.
+处理有界流的数据通常比无界流更有效。在(近)实时要求的系统中,处理无限的事件流要求系统能够立即响应事件并产生中间结果(通常具有低延迟)。处理有界流通常不需要产生低延迟结果,因为无论如何数据都有点旧(相对而言)。这样 Flink 就能以更加简单有效的方式去处理数据。
 
-The DataStream API captures the continuous processing of unbounded and bounded streams, with a model that supports
-low latency results and flexible reaction to events and time (including event time).
+DataStream API 基于一个支持低延迟和对事件和时间(包括事件时间)灵活反应的模型,用来连续处理无界流和有界流。
 
-The DataSet API has techniques that often speed up the processing of bounded data streams. In the future, the community
-plans to combine these optimizations with the techniques in the DataStream API.
+DataSet API 具有通常可加速有界数据流处理的技术。在未来,社区计划将这些优化与 DataStream API 中的技术相结合。
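To make the "one API for bounded and unbounded streams" point concrete, a minimal sketch (not part of the FAQ text): it assumes flink-streaming-java is on the classpath and that something is producing text on localhost:9999, e.g. nc -lk 9999; the sources are illustrative choices.

// Illustrative sketch: the DataStream API handles bounded and unbounded input alike.
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class BoundedVsUnbounded {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Bounded stream: a fixed collection of elements (ends after three records).
        env.fromElements("a", "b", "c").print();

        // Unbounded stream: a socket source that keeps emitting until the job is cancelled.
        env.socketTextStream("localhost", 9999).print();

        env.execute("bounded-vs-unbounded");
    }
}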
 
-How does Flink relate to the Hadoop Stack?
+Flink 与 Hadoop 软件栈是什么关系?
 
-Flink is independent of <a href="https://hadoop.apache.org/">Apache Hadoop</a> and runs without any Hadoop dependencies.
+Flink 独立于 <a href="https://hadoop.apache.org/">Apache Hadoop</a>,且能在没有任何 Hadoop 依赖的情况下运行。
 
-However, Flink integrates very well with many Hadoop components, for example, HDFS, YARN, or HBase.
-When running together with these components, Flink can use HDFS to read data, or write results and checkpoints/snapshots.
-Flink can be easily deployed via YARN and integrates with the YARN and HDFS Kerberos security modules.
+但是,Flink 可以很好的集成很多 Hadoop 组件,例如 HDFS、YARN 或 HBase。
+当与这些组件一起运行时,Flink 可以从 HDFS 读取数据,或写入结果和检查点(checkpoint)/快照(snapshot)数据到 HDFS。
+Flink 还可以通过 YARN 轻松部署,并与 YARN 和 HDFS Kerberos 安全模块集成。
 
-What other stacks does Flink run in?
+Flink 运行的其他软件栈是什么?
 
-Users run Flink on <a href="https://kubernetes.io">Kubernetes</a>,