buildbot success on flink-docs-release-1.7

2019-08-30 Thread buildbot
The Buildbot has detected a restored build on builder flink-docs-release-1.7
while building. Full details are available at:
https://ci.apache.org/builders/flink-docs-release-1.7/builds/277

Buildbot URL: https://ci.apache.org/

Buildslave for this Build: bb_slave2_ubuntu

Build Reason: The Nightly scheduler named 'flink-nightly-docs-release-1.7' 
triggered this build
Build Source Stamp: [branch release-1.7] HEAD
Blamelist: 

Build succeeded!

Sincerely,
 -The Buildbot





buildbot success on flink-docs-release-1.8

2019-08-30 Thread buildbot
The Buildbot has detected a restored build on builder flink-docs-release-1.8
while building. Full details are available at:
https://ci.apache.org/builders/flink-docs-release-1.8/builds/166

Buildbot URL: https://ci.apache.org/

Buildslave for this Build: bb_slave2_ubuntu

Build Reason: The Nightly scheduler named 'flink-nightly-docs-release-1.8' 
triggered this build
Build Source Stamp: [branch release-1.8] HEAD
Blamelist: 

Build succeeded!

Sincerely,
 -The Buildbot





buildbot success on flink-docs-release-1.9

2019-08-30 Thread buildbot
The Buildbot has detected a restored build on builder flink-docs-release-1.9
while building. Full details are available at:
https://ci.apache.org/builders/flink-docs-release-1.9/builds/11

Buildbot URL: https://ci.apache.org/

Buildslave for this Build: bb_slave2_ubuntu

Build Reason: The Nightly scheduler named 'flink-nightly-docs-release-1.9' 
triggered this build
Build Source Stamp: [branch release-1.9] HEAD
Blamelist: 

Build succeeded!

Sincerely,
 -The Buildbot





Build failed in Jenkins: flink-snapshot-deployment-1.7 #295

2019-08-30 Thread Apache Jenkins Server
See 


--
[...truncated 424.24 KB...]
[INFO] Excluding org.objenesis:objenesis:jar:2.1 from the shaded jar.
[INFO] No artifact matching filter io.netty:netty
[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing 

 with 

[INFO] Replacing original test artifact with shaded test artifact.
[INFO] Replacing 

 with 

[INFO] Dependency-reduced POM written at: 

[INFO] 
[INFO] >>> maven-source-plugin:2.2.1:jar (attach-sources) > generate-sources @ 
flink-scala_2.11 >>>
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.17:check (validate) @ flink-scala_2.11 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:3.0.0-M1:enforce (enforce-maven-version) @ 
flink-scala_2.11 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:3.0.0-M1:enforce (enforce-maven) @ 
flink-scala_2.11 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:3.0.0-M1:enforce (enforce-versions) @ 
flink-scala_2.11 ---
[INFO] 
[INFO] --- directory-maven-plugin:0.1:highest-basedir (directories) @ 
flink-scala_2.11 ---
[INFO] Highest basedir set to: 

[INFO] 
[INFO] --- build-helper-maven-plugin:1.7:add-source (add-source) @ 
flink-scala_2.11 ---
[INFO] Source directory: 

 added.
[INFO] 
[INFO] <<< maven-source-plugin:2.2.1:jar (attach-sources) < generate-sources @ 
flink-scala_2.11 <<<
[INFO] 
[INFO] --- maven-source-plugin:2.2.1:jar (attach-sources) @ flink-scala_2.11 ---
[INFO] Building jar: 

[INFO] 
[INFO] --- maven-javadoc-plugin:2.9.1:jar (attach-javadocs) @ flink-scala_2.11 
---
[INFO] Building jar: 

[INFO] 
[INFO] --- maven-surefire-plugin:2.18.1:test (integration-tests) @ 
flink-scala_2.11 ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- scalastyle-maven-plugin:1.0.0:check (default) @ flink-scala_2.11 ---
Saving to 
outputFile=
Processed 103 file(s)
Found 0 errors
Found 0 warnings
Found 0 infos
Finished in 1518 ms
[INFO] 
[INFO] --- japicmp-maven-plugin:0.11.0:cmp (default) @ flink-scala_2.11 ---
[INFO] Written file 
'
[INFO] Written file 
'
[INFO] Written file 
'
[INFO] 
[INFO] --- maven-install-plugin:2.5.2:install (default-install) @ 
flink-scala_2.11 ---
[INFO] Installing 

 to 
/home/jenkins/.m2/repository/org/apache/flink/flink-scala_2.11/1.7-SNAPSHOT/flink-scala_2.11-1.7-SNAPSHOT.jar
[INFO] Installing 

 to 
/home/jenkins/.m2/repository/org/apache/flink/flink-scala_2.11/1.7-SNAPSHOT/flink-scala_2.11-1.7-SNAPSHOT.pom
[INFO] Installing 

 to 
/home/jenkins/.m2/repository/org/apache/flink/flink-scala_2.11/1.7-SNAPSHOT/flink-scala_2.11-1.7-SNAPSHOT-tests.jar
[INFO] Installing 

 to 
/home/jenkins/.m2/repository/org/apache/flink/flink-scala_2.11/1.7-SNAPSHOT/flink-scala_2.11-1.7-SNAPSHOT-tests.jar
[INFO] Installing 

 to 
/home/jenkins/.m2/repository/org/apache/flink/flink-scala_2.11/1.7-SNAPSHOT/flink-scala_2.11-1.7-SNAPSHOT-sources.jar
[INFO] Installing 

buildbot success on flink-docs-master

2019-08-30 Thread buildbot
The Buildbot has detected a restored build on builder flink-docs-master while
building. Full details are available at:
https://ci.apache.org/builders/flink-docs-master/builds/1581

Buildbot URL: https://ci.apache.org/

Buildslave for this Build: bb_slave2_ubuntu

Build Reason: The Nightly scheduler named 'flink-nightly-docs-master' triggered 
this build
Build Source Stamp: [branch master] HEAD
Blamelist: 

Build succeeded!

Sincerely,
 -The Buildbot





[flink] branch master updated: [FLINK-13903][hive] Support Hive version 2.3.6

2019-08-30 Thread bli
This is an automated email from the ASF dual-hosted git repository.

bli pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new edf1f7d  [FLINK-13903][hive] Support Hive version 2.3.6
edf1f7d is described below

commit edf1f7d610c75adb228b8bc8262eb1865a91e362
Author: Xuefu Zhang 
AuthorDate: Fri Aug 30 12:02:18 2019 -0700

[FLINK-13903][hive] Support Hive version 2.3.6

Support Hive 2.3.6, which was released a few days ago.

This closes #9584.
---
 .../table/catalog/hive/client/HiveShimLoader.java  |  4 
 .../table/catalog/hive/client/HiveShimV236.java| 26 ++
 .../connectors/hive/HiveRunnerShimLoader.java  |  1 +
 3 files changed, 31 insertions(+)

diff --git 
a/flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/catalog/hive/client/HiveShimLoader.java
 
b/flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/catalog/hive/client/HiveShimLoader.java
index a45722d..802f2bd 100644
--- 
a/flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/catalog/hive/client/HiveShimLoader.java
+++ 
b/flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/catalog/hive/client/HiveShimLoader.java
@@ -43,6 +43,7 @@ public class HiveShimLoader {
public static final String HIVE_VERSION_V2_3_3 = "2.3.3";
public static final String HIVE_VERSION_V2_3_4 = "2.3.4";
public static final String HIVE_VERSION_V2_3_5 = "2.3.5";
+   public static final String HIVE_VERSION_V2_3_6 = "2.3.6";
 
private static final Map<String, HiveShim> hiveShims = new ConcurrentHashMap<>(2);
 
@@ -86,6 +87,9 @@ public class HiveShimLoader {
if (v.startsWith(HIVE_VERSION_V2_3_5)) {
return new HiveShimV235();
}
+   if (v.startsWith(HIVE_VERSION_V2_3_6)) {
+   return new HiveShimV236();
+   }
throw new CatalogException("Unsupported Hive version " 
+ v);
});
}
diff --git 
a/flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/catalog/hive/client/HiveShimV236.java
 
b/flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/catalog/hive/client/HiveShimV236.java
new file mode 100644
index 0000000..f9832c1
--- /dev/null
+++ 
b/flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/catalog/hive/client/HiveShimV236.java
@@ -0,0 +1,26 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.catalog.hive.client;
+
+/**
+ * Shim for Hive version 2.3.6.
+ */
+public class HiveShimV236 extends HiveShimV235 {
+
+}
diff --git 
a/flink-connectors/flink-connector-hive/src/test/java/org/apache/flink/connectors/hive/HiveRunnerShimLoader.java
 
b/flink-connectors/flink-connector-hive/src/test/java/org/apache/flink/connectors/hive/HiveRunnerShimLoader.java
index 6903cf6..ab0ee80 100644
--- 
a/flink-connectors/flink-connector-hive/src/test/java/org/apache/flink/connectors/hive/HiveRunnerShimLoader.java
+++ 
b/flink-connectors/flink-connector-hive/src/test/java/org/apache/flink/connectors/hive/HiveRunnerShimLoader.java
@@ -49,6 +49,7 @@ public class HiveRunnerShimLoader {
case HiveShimLoader.HIVE_VERSION_V2_3_3:
case HiveShimLoader.HIVE_VERSION_V2_3_4:
case HiveShimLoader.HIVE_VERSION_V2_3_5:
+   case HiveShimLoader.HIVE_VERSION_V2_3_6:
return new HiveRunnerShimV4();
default:
throw new RuntimeException("Unsupported 
Hive version " + v);

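The HiveShimLoader change above follows a common version-dispatch pattern: shims are cached in a ConcurrentHashMap and resolved lazily by version prefix, and a new patch release gets an (often empty) subclass of the previous shim. A minimal sketch of that pattern, where all class names are illustrative stand-ins rather than the actual Flink API:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Sketch of the version-dispatch pattern from the HiveShimLoader change
// above. The Shim classes here are hypothetical stand-ins.
interface Shim {
    String describe();
}

class ShimV235 implements Shim {
    public String describe() {
        return "shim for 2.3.5";
    }
}

// A new patch release can often reuse the previous shim unchanged,
// mirroring how HiveShimV236 simply extends HiveShimV235 in the commit.
class ShimV236 extends ShimV235 {
}

public class ShimLoaderSketch {
    private static final Map<String, Shim> shims = new ConcurrentHashMap<>(2);

    public static Shim loadShim(String version) {
        // computeIfAbsent creates the shim once and caches it per version.
        return shims.computeIfAbsent(version, v -> {
            if (v.startsWith("2.3.5")) {
                return new ShimV235();
            }
            if (v.startsWith("2.3.6")) {
                return new ShimV236();
            }
            throw new IllegalArgumentException("Unsupported version " + v);
        });
    }

    public static void main(String[] args) {
        Shim shim = loadShim("2.3.6");
        System.out.println(shim.describe()); // prints "shim for 2.3.5" (inherited)
        System.out.println(shim == loadShim("2.3.6")); // prints "true" (cached)
    }
}
```

With this shape, supporting a new version reduces to one constant, one branch, and one small subclass, which is exactly the three-file, 31-line diff above.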


[flink] branch master updated: [FLINK-12847][kinesis] update flink-connector-kinesis to use Apache 2.0 license code

2019-08-30 Thread bli
This is an automated email from the ASF dual-hosted git repository.

bli pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 08d446c  [FLINK-12847][kinesis] update flink-connector-kinesis to use 
Apache 2.0 license code
08d446c is described below

commit 08d446c2176ff586e3c74c61af3f6797614f051e
Author: Dyana Rose 
AuthorDate: Wed Jul 31 10:29:38 2019 +0100

[FLINK-12847][kinesis] update flink-connector-kinesis to use Apache 2.0 
license code

The Kinesis Connector will now be able to be built and included in the 
build artefacts as it no longer pulls in any Amazon licensed code.

This closes #9494.
---
 .travis.yml| 118 ++---
 docs/dev/connectors/kinesis.md |  28 +
 docs/dev/connectors/kinesis.zh.md  |  27 +
 flink-connectors/flink-connector-kinesis/pom.xml   |  24 +++--
 .../src/main/resources/META-INF/NOTICE |  22 ++--
 .../resources/META-INF/licenses/LICENSE.amazon |  39 ---
 flink-connectors/pom.xml   |  18 +---
 flink-end-to-end-tests/pom.xml |  22 +---
 tools/travis/stage.sh  |   5 +-
 9 files changed, 94 insertions(+), 209 deletions(-)

diff --git a/.travis.yml b/.travis.yml
index 8db3525..f58873d 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -81,179 +81,179 @@ jobs:
 - if: type in (pull_request, push)
   stage: compile
   script: ./tools/travis_controller.sh compile
-  env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis 
-Dinclude_hadoop_aws -Dscala-2.11"
+  env: PROFILE="-Dhadoop.version=2.8.3 -Dinclude_hadoop_aws -Dscala-2.11"
   name: compile
 - if: type in (pull_request, push)
   stage: test
   script: ./tools/travis_controller.sh core
-  env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis 
-Dinclude_hadoop_aws -Dscala-2.11"
+  env: PROFILE="-Dhadoop.version=2.8.3 -Dinclude_hadoop_aws -Dscala-2.11"
   name: core
 - if: type in (pull_request, push)
   script: ./tools/travis_controller.sh python
-  env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis 
-Dinclude_hadoop_aws -Dscala-2.11"
+  env: PROFILE="-Dhadoop.version=2.8.3 -Dinclude_hadoop_aws -Dscala-2.11"
   name: python
 - if: type in (pull_request, push)
   script: ./tools/travis_controller.sh libraries
-  env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis 
-Dinclude_hadoop_aws -Dscala-2.11"
+  env: PROFILE="-Dhadoop.version=2.8.3 -Dinclude_hadoop_aws -Dscala-2.11"
   name: libraries
 - if: type in (pull_request, push)
   script: ./tools/travis_controller.sh blink_planner
-  env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis 
-Dinclude_hadoop_aws -Dscala-2.11"
+  env: PROFILE="-Dhadoop.version=2.8.3 -Dinclude_hadoop_aws -Dscala-2.11"
   name: blink_planner
 - if: type in (pull_request, push)
   script: ./tools/travis_controller.sh connectors
-  env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis 
-Dinclude_hadoop_aws -Dscala-2.11"
+  env: PROFILE="-Dhadoop.version=2.8.3 -Dinclude_hadoop_aws -Dscala-2.11"
   name: connectors
 - if: type in (pull_request, push)
   script: ./tools/travis_controller.sh kafka/gelly
-  env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis 
-Dinclude_hadoop_aws -Dscala-2.11"
+  env: PROFILE="-Dhadoop.version=2.8.3 -Dinclude_hadoop_aws -Dscala-2.11"
   name: kafka/gelly
 - if: type in (pull_request, push)
   script: ./tools/travis_controller.sh tests
-  env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis 
-Dinclude_hadoop_aws -Dscala-2.11"
+  env: PROFILE="-Dhadoop.version=2.8.3 -Dinclude_hadoop_aws -Dscala-2.11"
   name: tests
 - if: type in (pull_request, push)
   script: ./tools/travis_controller.sh misc
-  env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis 
-Dinclude_hadoop_aws -Dscala-2.11"
+  env: PROFILE="-Dhadoop.version=2.8.3 -Dinclude_hadoop_aws -Dscala-2.11"
   name: misc
 - if: type in (pull_request, push)
   stage: cleanup
   script: ./tools/travis_controller.sh cleanup
-  env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis 
-Dinclude_hadoop_aws -Dscala-2.11"
+  env: PROFILE="-Dhadoop.version=2.8.3 -Dinclude_hadoop_aws -Dscala-2.11"
   name: cleanup
 # hadoop 2.4.1 profile
 - if: type = cron
   stage: compile
   script: ./tools/travis_controller.sh compile
-  env: PROFILE="-Dhadoop.version=2.4.1 -Pinclude-kinesis -Pskip-hive-tests"
+  env: PROFILE="-Dhadoop.version=2.4.1 -Pskip-hive-tests"
   name: compile - hadoop 2.4.1
 - if: type = cron
   stage: test
   script: ./tools/travis_controller.sh core
-  env: PROFILE="-Dhadoop.version=2.4.1 -Pinclude-kinesis -Pskip-hive-tests"
+  env: 

[flink] branch master updated: [FLINK-13814][hive] HiveTableSink should strip quotes from partition values

2019-08-30 Thread bli
This is an automated email from the ASF dual-hosted git repository.

bli pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 07dfffe  [FLINK-13814][hive] HiveTableSink should strip quotes from 
partition values
07dfffe is described below

commit 07dfffec8248788630bff5e99afe9866d8b50487
Author: Rui Li 
AuthorDate: Thu Aug 22 14:53:26 2019 +0800

[FLINK-13814][hive] HiveTableSink should strip quotes from partition values

Strip quotes from partition value in order to get proper string values.

This closes #9502.
---
 .../flink/connectors/hive/HiveTableSinkTest.java   | 57 --
 .../connectors/hive/TableEnvHiveConnectorTest.java | 52 
 .../apache/flink/sql/parser/dml/RichSqlInsert.java |  7 ++-
 3 files changed, 57 insertions(+), 59 deletions(-)

diff --git 
a/flink-connectors/flink-connector-hive/src/test/java/org/apache/flink/connectors/hive/HiveTableSinkTest.java
 
b/flink-connectors/flink-connector-hive/src/test/java/org/apache/flink/connectors/hive/HiveTableSinkTest.java
index 13bddc0..51a56fb 100644
--- 
a/flink-connectors/flink-connector-hive/src/test/java/org/apache/flink/connectors/hive/HiveTableSinkTest.java
+++ 
b/flink-connectors/flink-connector-hive/src/test/java/org/apache/flink/connectors/hive/HiveTableSinkTest.java
@@ -27,7 +27,6 @@ import org.apache.flink.table.api.DataTypes;
 import org.apache.flink.table.api.Table;
 import org.apache.flink.table.api.TableEnvironment;
 import org.apache.flink.table.api.TableSchema;
-import org.apache.flink.table.catalog.CatalogPartitionSpec;
 import org.apache.flink.table.catalog.CatalogTable;
 import org.apache.flink.table.catalog.CatalogTableImpl;
 import org.apache.flink.table.catalog.ObjectPath;
@@ -41,7 +40,6 @@ import org.apache.flink.types.Row;
 import com.klarna.hiverunner.HiveShell;
 import com.klarna.hiverunner.annotations.HiveSQL;
 import org.apache.hadoop.hive.conf.HiveConf;
-import org.apache.hadoop.mapred.JobConf;
 import org.junit.AfterClass;
 import org.junit.BeforeClass;
 import org.junit.Test;
@@ -107,31 +105,6 @@ public class HiveTableSinkTest {
}
 
@Test
-   public void testInsertIntoDynamicPartition() throws Exception {
-   String dbName = "default";
-   String tblName = "dest";
-   RowTypeInfo rowTypeInfo = createDestTable(dbName, tblName, 1);
-   ObjectPath tablePath = new ObjectPath(dbName, tblName);
-
-   TableEnvironment tableEnv = HiveTestUtils.createTableEnv();
-
-   List<Row> toWrite = generateRecords(5);
-   Table src = tableEnv.fromTableSource(new 
CollectionTableSource(toWrite, rowTypeInfo));
-   tableEnv.registerTable("src", src);
-
-   tableEnv.registerCatalog("hive", hiveCatalog);
-   tableEnv.sqlQuery("select * from src").insertInto("hive", 
"default", "dest");
-   tableEnv.execute("mytest");
-
-   List<CatalogPartitionSpec> partitionSpecs = hiveCatalog.listPartitions(tablePath);
-   assertEquals(toWrite.size(), partitionSpecs.size());
-
-   verifyWrittenData(toWrite, hiveShell.executeQuery("select * 
from " + tblName));
-
-   hiveCatalog.dropTable(tablePath, false);
-   }
-
-   @Test
public void testWriteComplexType() throws Exception {
String dbName = "default";
String tblName = "dest";
@@ -213,36 +186,6 @@ public class HiveTableSinkTest {
hiveCatalog.dropTable(tablePath, false);
}
 
-   @Test
-   public void testInsertIntoStaticPartition() throws Exception {
-   String dbName = "default";
-   String tblName = "dest";
-   RowTypeInfo rowTypeInfo = createDestTable(dbName, tblName, 1);
-   ObjectPath tablePath = new ObjectPath(dbName, tblName);
-
-   TableEnvironment tableEnv = HiveTestUtils.createTableEnv();
-   List<Row> toWrite = generateRecords(1);
-   Table src = tableEnv.fromTableSource(new 
CollectionTableSource(toWrite, rowTypeInfo));
-   tableEnv.registerTable("src", src);
-
-   Map<String, String> partSpec = new HashMap<>();
-   partSpec.put("s", "a");
-
-   CatalogTable table = (CatalogTable) 
hiveCatalog.getTable(tablePath);
-   HiveTableSink hiveTableSink = new HiveTableSink(new 
JobConf(hiveConf), tablePath, table);
-   hiveTableSink.setStaticPartition(partSpec);
-   tableEnv.registerTableSink("destSink", hiveTableSink);
-   tableEnv.sqlQuery("select * from src").insertInto("destSink");
-   tableEnv.execute("mytest");
-
-   // make sure new partition is created
-   assertEquals(toWrite.size(), 
hiveCatalog.listPartitions(tablePath).size());
-
-   verifyWrittenData(toWrite, 

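The fix described above strips quote characters from static partition values so that a literal such as `'a'` is stored as the plain string `a`. A minimal, hedged sketch of such quote stripping (the class and method names are hypothetical, not Flink's actual RichSqlInsert code):

```java
// Hypothetical sketch: remove one pair of surrounding single or double
// quotes from a SQL partition-value literal, as the commit above does
// for static partition specs. Unquoted values pass through unchanged.
public class PartitionValueUtil {

    public static String stripQuotes(String value) {
        if (value != null && value.length() >= 2) {
            char first = value.charAt(0);
            char last = value.charAt(value.length() - 1);
            if ((first == '\'' && last == '\'') || (first == '"' && last == '"')) {
                return value.substring(1, value.length() - 1);
            }
        }
        return value;
    }

    public static void main(String[] args) {
        System.out.println(stripQuotes("'a'"));   // prints "a"
        System.out.println(stripQuotes("\"b\"")); // prints "b"
        System.out.println(stripQuotes("c"));     // prints "c"
    }
}
```

Without such stripping, the quotes from the SQL text would leak into the partition name, which is why the test changes above compare written data against proper unquoted string values.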
[flink] branch master updated (ce55783 -> 403295f)

2019-08-30 Thread trohrmann
This is an automated email from the ASF dual-hosted git repository.

trohrmann pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from ce55783  [FLINK-13051][runtime] Rename TwoInputSelectableStreamTask 
and StreamTwoInputSelectableProcessor
 add 403295f  [hotfix] Correct broken link in sql.zh.md

No new revisions were added by this update.

Summary of changes:
 docs/dev/table/sql.zh.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)



[flink-web] 01/02: Add missing flink-shaded-8.0 entry to chinese downloads page

2019-08-30 Thread chesnay
This is an automated email from the ASF dual-hosted git repository.

chesnay pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git

commit 89b72bcc8700c85b169ca549fd8537b07b2146ce
Author: Chesnay Schepler 
AuthorDate: Fri Aug 30 13:01:55 2019 +0200

Add missing flink-shaded-8.0 entry to chinese downloads page
---
 downloads.zh.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/downloads.zh.md b/downloads.zh.md
index d40cf99..0851340 100644
--- a/downloads.zh.md
+++ b/downloads.zh.md
@@ -233,6 +233,7 @@ flink-docs-release-{{ flink_release.version_short 
}}/release-notes/flink-{{ flin
 - Flink 0.6-incubating - 2014-08-26 
([Source](https://archive.apache.org/dist/incubator/flink/flink-0.6-incubating-src.tgz),
 [Binaries](https://archive.apache.org/dist/incubator/flink/))
 
 ### Flink-shaded
+- Flink-shaded 8.0 - 2019-08-28 
([Source](https://archive.apache.org/dist/flink/flink-shaded-8.0/flink-shaded-8.0-src.tgz))
 - Flink-shaded 7.0 - 2019-05-30 
([Source](https://archive.apache.org/dist/flink/flink-shaded-7.0/flink-shaded-7.0-src.tgz))
 - Flink-shaded 6.0 - 2019-02-12 
([Source](https://archive.apache.org/dist/flink/flink-shaded-6.0/flink-shaded-6.0-src.tgz))
 - Flink-shaded 5.0 - 2018-10-15 
([Source](https://archive.apache.org/dist/flink/flink-shaded-5.0/flink-shaded-5.0-src.tgz))



[flink-web] branch asf-site updated (06e0f0f -> c520e37)

2019-08-30 Thread chesnay
This is an automated email from the ASF dual-hosted git repository.

chesnay pushed a change to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git.


from 06e0f0f  Rebuild website
 new 89b72bc  Add missing flink-shaded-8.0 entry to chinese downloads page
 new c520e37  Rebuild website

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 content/zh/downloads.html | 1 +
 downloads.zh.md   | 1 +
 2 files changed, 2 insertions(+)



[flink-web] 02/02: Rebuild website

2019-08-30 Thread chesnay
This is an automated email from the ASF dual-hosted git repository.

chesnay pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git

commit c520e375fc34b5655b7d7c6382cefec444c653bb
Author: Chesnay Schepler 
AuthorDate: Fri Aug 30 13:02:14 2019 +0200

Rebuild website
---
 content/zh/downloads.html | 1 +
 1 file changed, 1 insertion(+)

diff --git a/content/zh/downloads.html b/content/zh/downloads.html
index 71d9412..8489e08 100644
--- a/content/zh/downloads.html
+++ b/content/zh/downloads.html
@@ -514,6 +514,7 @@ flink-docs-release-1.6/release-notes/flink-1.6.html">Flink 
1.9 的发布说明Flink-shaded
 
+  Flink-shaded 8.0 - 2019-08-28 (https://archive.apache.org/dist/flink/flink-shaded-8.0/flink-shaded-8.0-src.tgz;>Source)
   Flink-shaded 7.0 - 2019-05-30 (https://archive.apache.org/dist/flink/flink-shaded-7.0/flink-shaded-7.0-src.tgz;>Source)
   Flink-shaded 6.0 - 2019-02-12 (https://archive.apache.org/dist/flink/flink-shaded-6.0/flink-shaded-6.0-src.tgz;>Source)
   Flink-shaded 5.0 - 2018-10-15 (https://archive.apache.org/dist/flink/flink-shaded-5.0/flink-shaded-5.0-src.tgz;>Source)



[flink-web] branch asf-site updated: Rebuild website

2019-08-30 Thread chesnay
This is an automated email from the ASF dual-hosted git repository.

chesnay pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git


The following commit(s) were added to refs/heads/asf-site by this push:
 new 06e0f0f  Rebuild website
06e0f0f is described below

commit 06e0f0f67c165a490debfaefabe3ca41e5d851c1
Author: Chesnay Schepler 
AuthorDate: Fri Aug 30 12:59:24 2019 +0200

Rebuild website
---
 content/downloads.html| 5 +++--
 content/zh/downloads.html | 4 ++--
 2 files changed, 5 insertions(+), 4 deletions(-)

diff --git a/content/downloads.html b/content/downloads.html
index a5b20eb..5804511 100644
--- a/content/downloads.html
+++ b/content/downloads.html
@@ -408,8 +408,8 @@ HADOOP_CLASSPATH.
 main Flink release:
 
 
-https://www.apache.org/dyn/closer.lua/flink/flink-shaded-7.0/flink-shaded-7.0-src.tgz;
 class="ga-track" id="s70-download-source">Apache Flink-shaded 7.0
-(https://www.apache.org/dist/flink/flink-shaded-7.0/flink-shaded-7.0-src.tgz.asc;>asc,
 https://www.apache.org/dist/flink/flink-shaded-7.0/flink-shaded-7.0-src.tgz.sha512;>sha512)
+https://www.apache.org/dyn/closer.lua/flink/flink-shaded-8.0/flink-shaded-8.0-src.tgz;
 class="ga-track" id="s80-download-source">Apache Flink-shaded 8.0
+(https://www.apache.org/dist/flink/flink-shaded-8.0/flink-shaded-8.0-src.tgz.asc;>asc,
 https://www.apache.org/dist/flink/flink-shaded-8.0/flink-shaded-8.0-src.tgz.sha512;>sha512)
 
 
 Verifying Hashes and Signatures
@@ -504,6 +504,7 @@ main Flink release:
 
 Flink-shaded
 
+  Flink-shaded 8.0 - 2019-08-28 (https://archive.apache.org/dist/flink/flink-shaded-8.0/flink-shaded-8.0-src.tgz;>Source)
   Flink-shaded 7.0 - 2019-05-30 (https://archive.apache.org/dist/flink/flink-shaded-7.0/flink-shaded-7.0-src.tgz;>Source)
   Flink-shaded 6.0 - 2019-02-12 (https://archive.apache.org/dist/flink/flink-shaded-6.0/flink-shaded-6.0-src.tgz;>Source)
   Flink-shaded 5.0 - 2018-10-15 (https://archive.apache.org/dist/flink/flink-shaded-5.0/flink-shaded-5.0-src.tgz;>Source)
diff --git a/content/zh/downloads.html b/content/zh/downloads.html
index 7d8cf6b..71d9412 100644
--- a/content/zh/downloads.html
+++ b/content/zh/downloads.html
@@ -422,8 +422,8 @@ flink-docs-release-1.6/release-notes/flink-1.6.html">Flink 
1.9 的发布说明其他不包含在 Flink 的主要发布的组件如下所示:
 
 
-https://www.apache.org/dyn/closer.lua/flink/flink-shaded-7.0/flink-shaded-7.0-src.tgz;
 class="ga-track" id="s70-download-source">Apache Flink-shaded 7.0
-(https://www.apache.org/dist/flink/flink-shaded-7.0/flink-shaded-7.0-src.tgz.asc;>asc,
 https://www.apache.org/dist/flink/flink-shaded-7.0/flink-shaded-7.0-src.tgz.sha512;>sha512)
+https://www.apache.org/dyn/closer.lua/flink/flink-shaded-8.0/flink-shaded-8.0-src.tgz;
 class="ga-track" id="s80-download-source">Apache Flink-shaded 8.0
+(https://www.apache.org/dist/flink/flink-shaded-8.0/flink-shaded-8.0-src.tgz.asc;>asc,
 https://www.apache.org/dist/flink/flink-shaded-8.0/flink-shaded-8.0-src.tgz.sha512;>sha512)
 
 
 验证哈希和签名



[flink-web] branch asf-site updated: Add flink-shaded 8.0

2019-08-30 Thread chesnay
This is an automated email from the ASF dual-hosted git repository.

chesnay pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git


The following commit(s) were added to refs/heads/asf-site by this push:
 new 660aa9e  Add flink-shaded 8.0
660aa9e is described below

commit 660aa9ef89b29025040154e942471e1ff35649c0
Author: zentol 
AuthorDate: Fri Aug 23 20:50:33 2019 +0200

Add flink-shaded 8.0
---
 _config.yml  | 10 +-
 downloads.md |  1 +
 2 files changed, 6 insertions(+), 5 deletions(-)

diff --git a/_config.yml b/_config.yml
index edf3e14..d7dc5c4 100644
--- a/_config.yml
+++ b/_config.yml
@@ -336,11 +336,11 @@ flink_releases:
 component_releases:
   -
   source_release:
-  name: "Apache Flink-shaded 7.0"
-  id: "s70-download-source"
-  url: 
"https://www.apache.org/dyn/closer.lua/flink/flink-shaded-7.0/flink-shaded-7.0-src.tgz;
-  asc_url: 
"https://www.apache.org/dist/flink/flink-shaded-7.0/flink-shaded-7.0-src.tgz.asc;
-  sha512_url: 
"https://www.apache.org/dist/flink/flink-shaded-7.0/flink-shaded-7.0-src.tgz.sha512;
+  name: "Apache Flink-shaded 8.0"
+  id: "s80-download-source"
+  url: 
"https://www.apache.org/dyn/closer.lua/flink/flink-shaded-8.0/flink-shaded-8.0-src.tgz;
+  asc_url: 
"https://www.apache.org/dist/flink/flink-shaded-8.0/flink-shaded-8.0-src.tgz.asc;
+  sha512_url: 
"https://www.apache.org/dist/flink/flink-shaded-8.0/flink-shaded-8.0-src.tgz.sha512;
 
 
 # Version numbers used in the text for stable and snapshot versions,
diff --git a/downloads.md b/downloads.md
index 4d73c27..b527ded 100644
--- a/downloads.md
+++ b/downloads.md
@@ -226,6 +226,7 @@ All Flink releases are available via 
[https://archive.apache.org/dist/flink/](ht
 - Flink 0.6-incubating - 2014-08-26 
([Source](https://archive.apache.org/dist/incubator/flink/flink-0.6-incubating-src.tgz),
 [Binaries](https://archive.apache.org/dist/incubator/flink/))
 
 ### Flink-shaded
+- Flink-shaded 8.0 - 2019-08-28 
([Source](https://archive.apache.org/dist/flink/flink-shaded-8.0/flink-shaded-8.0-src.tgz))
 - Flink-shaded 7.0 - 2019-05-30 
([Source](https://archive.apache.org/dist/flink/flink-shaded-7.0/flink-shaded-7.0-src.tgz))
 - Flink-shaded 6.0 - 2019-02-12 
([Source](https://archive.apache.org/dist/flink/flink-shaded-6.0/flink-shaded-6.0-src.tgz))
 - Flink-shaded 5.0 - 2018-10-15 
([Source](https://archive.apache.org/dist/flink/flink-shaded-5.0/flink-shaded-5.0-src.tgz))



[flink] branch master updated (ac0799a -> ce55783)

2019-08-30 Thread pnowojski
This is an automated email from the ASF dual-hosted git repository.

pnowojski pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from ac0799a  [hotfix][test] Prefer propogate exceptions
 add dc74248  [FLINK-13051][runtime] Replace the non-selectable stream task 
with the input-selectable one
 add 8ca6a09  [hotfix][runtime] Remove the unused 
StreamOperatorFactory#isOperatorSelectiveReading method
 add 6779bd6  [hotfix][runtime] Clean up unnecessary type argument 
declarations in TwoInputStreamTaskTest and TwoInputStreamTaskTestHarness
 add d23e237  [hotfix][runtime] Clean up unused member variables in 
StreamTwoInputSelectableProcessor
 add 7e21ec8  [FLINK-13051][runtime] Drop the non-selectable StreamTask and 
InputProcessor
 add ce55783  [FLINK-13051][runtime] Rename TwoInputSelectableStreamTask 
and StreamTwoInputSelectableProcessor

No new revisions were added by this update.

Summary of changes:
 .../flink/streaming/api/graph/StreamGraph.java |   5 +-
 .../api/operators/SimpleOperatorFactory.java   |   5 -
 .../api/operators/StreamOperatorFactory.java   |   5 -
 .../runtime/io/StreamTwoInputProcessor.java| 577 +
 .../io/StreamTwoInputSelectableProcessor.java  | 425 ---
 .../runtime/io/TwoInputSelectionHandler.java   |  15 +-
 .../tasks/TwoInputSelectableStreamTask.java|  72 ---
 .../runtime/tasks/TwoInputStreamTask.java  |  15 +-
 .../tasks/StreamTaskSelectiveReadingTest.java  |   2 +-
 .../runtime/tasks/TwoInputStreamTaskTest.java  | 181 ++-
 .../tasks/TwoInputStreamTaskTestHarness.java   |  25 +-
 .../runtime/operators/CodeGenOperatorFactory.java  |   6 -
 .../operators/join/Int2HashJoinOperatorTest.java   |   7 +-
 .../runtime/StreamTaskSelectiveReadingITCase.java  |   2 +-
 14 files changed, 349 insertions(+), 993 deletions(-)
 delete mode 100644 
flink-streaming-java/src/main/java/org/apache/flink/streaming/runtime/io/StreamTwoInputSelectableProcessor.java
 delete mode 100644 
flink-streaming-java/src/main/java/org/apache/flink/streaming/runtime/tasks/TwoInputSelectableStreamTask.java



[flink] branch java11_test updated: remove test scope

2019-08-30 Thread chesnay
This is an automated email from the ASF dual-hosted git repository.

chesnay pushed a commit to branch java11_test
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/java11_test by this push:
 new 9a581f5  remove test scope
9a581f5 is described below

commit 9a581f5db464432008727e68da205e7cb6451d15
Author: Chesnay Schepler 
AuthorDate: Fri Aug 30 10:58:32 2019 +0200

remove test scope
---
 flink-filesystems/flink-fs-hadoop-shaded/pom.xml | 1 -
 1 file changed, 1 deletion(-)

diff --git a/flink-filesystems/flink-fs-hadoop-shaded/pom.xml 
b/flink-filesystems/flink-fs-hadoop-shaded/pom.xml
index e72261b..8dd1f89 100644
--- a/flink-filesystems/flink-fs-hadoop-shaded/pom.xml
+++ b/flink-filesystems/flink-fs-hadoop-shaded/pom.xml
@@ -273,7 +273,6 @@ under the License.
 <groupId>javax.xml.bind</groupId>
 <artifactId>jaxb-api</artifactId>
 <version>2.3.0</version>
-<scope>test</scope>






[flink] branch master updated (d60fca5 -> ac0799a)

2019-08-30 Thread chesnay
This is an automated email from the ASF dual-hosted git repository.

chesnay pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from d60fca5  Revert "[FLINK-13729][docs] Update website generation 
dependencies"
 add a02f7f8  [FLINK-13828][configuration] Deprecate 
ConfigConstants.LOCAL_START_WEBSERVER
 add ac0799a  [hotfix][test] Prefer propogate exceptions

No new revisions were added by this update.

Summary of changes:
 .../flink/configuration/ConfigConstants.java   |  4 +
 .../flink/api/java/ExecutionEnvironment.java   |  3 -
 .../runtime/webmonitor/WebFrontendITCase.java  | 94 +-
 .../environment/StreamExecutionEnvironment.java|  3 -
 4 files changed, 42 insertions(+), 62 deletions(-)



[flink] branch master updated (b770bcd -> d60fca5)

2019-08-30 Thread chesnay
This is an automated email from the ASF dual-hosted git repository.

chesnay pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from b770bcd  [FLINK-13548][tests] Add dedicated test case for priority scheduling
 add 17ecac1  Revert "[FLINK-13791][docs] Speed up sidenav by using group_by"
 add ba8bf02  Revert "[FLINK-13726][docs] Build docs with jekyll 4.0.0.pre.beta1"
 add fc382d2  Revert "[hotfix][docs] Temporarily disable liveserve"
 add 13d19b2  Revert "[FLINK-13725][docs] use sassc for faster doc generation"
 add d60fca5  Revert "[FLINK-13729][docs] Update website generation dependencies"

No new revisions were added by this update.

Summary of changes:
 docs/.gitignore |  3 +-
 docs/Gemfile| 19 +-
 docs/Gemfile.lock   | 86 +
 docs/README.md  |  3 +-
 docs/_includes/sidenav.html |  7 ++--
 docs/build_docs.sh  |  2 +-
 6 files changed, 55 insertions(+), 65 deletions(-)



[flink] branch master updated (60b65c4 -> b770bcd)

2019-08-30 Thread trohrmann
This is an automated email from the ASF dual-hosted git repository.

trohrmann pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from 60b65c4  [FLINK-13514] Fix instability in StreamTaskTest.testAsyncCheckpointingConcurrentCloseAfterAcknowledge
 add a9d34dd  [FLINK-13548][Deployment/YARN]Support priority of the Flink YARN application.
 add ba70643  [hotfix] Create YarnTestUtils in flink-yarn
 add b770bcd  [FLINK-13548][tests] Add dedicated test case for priority scheduling

No new revisions were added by this update.

Summary of changes:
 .../generated/yarn_config_configuration.html   |  5 ++
 docs/ops/deployment/yarn_setup.md  |  9 +++
 docs/ops/deployment/yarn_setup.zh.md   |  9 +++
 .../yarn/YARNSessionCapacitySchedulerITCase.java   | 10 ---
 .../flink/yarn/YarnPrioritySchedulingITCase.java   | 85 ++
 .../java/org/apache/flink/yarn/YarnTestBase.java   | 14 
 .../flink/yarn/AbstractYarnClusterDescriptor.java  |  8 ++
 .../yarn/configuration/YarnConfigOptions.java  | 16 
 ...sterApplicationMasterResponseReflectorTest.java |  8 +-
 .../java/org/apache/flink/yarn/YarnTestUtils.java  | 28 +++
 10 files changed, 162 insertions(+), 30 deletions(-)
 create mode 100644 flink-yarn-tests/src/test/java/org/apache/flink/yarn/YarnPrioritySchedulingITCase.java
 copy flink-runtime/src/main/java/org/apache/flink/runtime/clusterframework/overlays/ContainerOverlay.java => flink-yarn/src/test/java/org/apache/flink/yarn/YarnTestUtils.java (59%)



[flink] branch release-1.9 updated: [FLINK-13514] Fix instability in StreamTaskTest.testAsyncCheckpointingConcurrentCloseAfterAcknowledge

2019-08-30 Thread aljoscha
This is an automated email from the ASF dual-hosted git repository.

aljoscha pushed a commit to branch release-1.9
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.9 by this push:
 new 3d71518  [FLINK-13514] Fix instability in StreamTaskTest.testAsyncCheckpointingConcurrentCloseAfterAcknowledge
3d71518 is described below

commit 3d71518ef9c96bc8fe0add3b4c25bf141aa599db
Author: Aljoscha Krettek 
AuthorDate: Thu Aug 29 16:33:33 2019 +0200

    [FLINK-13514] Fix instability in StreamTaskTest.testAsyncCheckpointingConcurrentCloseAfterAcknowledge

Before, thread pool shutdown would interrupt our waiting method.
Production code cannot throw an InterruptedException here and would also
not be correct if one is thrown.

We now swallow interrupted exceptions and wait until we successfully
return from await().
---
 .../flink/streaming/runtime/tasks/StreamTaskTest.java| 16 ++--
 1 file changed, 14 insertions(+), 2 deletions(-)

diff --git a/flink-streaming-java/src/test/java/org/apache/flink/streaming/runtime/tasks/StreamTaskTest.java b/flink-streaming-java/src/test/java/org/apache/flink/streaming/runtime/tasks/StreamTaskTest.java
index 2259501..d0295f1 100644
--- a/flink-streaming-java/src/test/java/org/apache/flink/streaming/runtime/tasks/StreamTaskTest.java
+++ b/flink-streaming-java/src/test/java/org/apache/flink/streaming/runtime/tasks/StreamTaskTest.java
@@ -482,11 +482,23 @@ public class StreamTaskTest extends TestLogger {
 		CheckpointResponder checkpointResponder = mock(CheckpointResponder.class);
 		doAnswer(new Answer() {
 			@Override
-			public Object answer(InvocationOnMock invocation) throws Throwable {
+			public Object answer(InvocationOnMock invocation) {
 				acknowledgeCheckpointLatch.trigger();
 
 				// block here so that we can issue the concurrent cancel call
-				completeAcknowledge.await();
+				while (true) {
+					try {
+						// wait until we successfully await (no pun intended)
+						completeAcknowledge.await();
+
+						// when await() returns normally, we break out of the loop
+						break;
+					} catch (InterruptedException e) {
+						// survive interruptions that arise from thread pool shutdown
+						// production code cannot actually throw InterruptedException from
+						// checkpoint acknowledgement
+					}
+				}
 
 				return null;
 			}
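The fix above retries `await()` in a loop and swallows `InterruptedException` so that thread-pool shutdown cannot abort the wait. As a standalone illustration of that pattern (not Flink code; the class and method names here are invented for the example), an uninterruptible wait can be sketched as follows, with the common refinement of restoring the thread's interrupt flag before returning:

```java
import java.util.concurrent.CountDownLatch;

public class UninterruptibleAwait {

	/**
	 * Waits until the latch opens, surviving interrupts. The interrupt
	 * flag is restored before returning so callers can still observe
	 * that an interruption happened.
	 */
	public static void awaitUninterruptibly(CountDownLatch latch) {
		boolean interrupted = false;
		while (true) {
			try {
				latch.await();
				break; // latch opened normally, leave the loop
			} catch (InterruptedException e) {
				interrupted = true; // remember the interrupt and retry
			}
		}
		if (interrupted) {
			Thread.currentThread().interrupt();
		}
	}

	public static void main(String[] args) throws Exception {
		CountDownLatch latch = new CountDownLatch(1);
		Thread waiter = new Thread(() -> awaitUninterruptibly(latch));
		waiter.start();
		waiter.interrupt();  // the interrupt is swallowed, the wait continues
		latch.countDown();   // only now can the waiter finish
		waiter.join(5000);
		System.out.println("waiter finished: " + !waiter.isAlive());
	}
}
```

The test in the commit omits the flag restoration because the mocked responder deliberately discards the interrupt; general-purpose code usually should not.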



[flink] branch master updated: [FLINK-13514] Fix instability in StreamTaskTest.testAsyncCheckpointingConcurrentCloseAfterAcknowledge

2019-08-30 Thread aljoscha
This is an automated email from the ASF dual-hosted git repository.

aljoscha pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 60b65c4  [FLINK-13514] Fix instability in StreamTaskTest.testAsyncCheckpointingConcurrentCloseAfterAcknowledge
60b65c4 is described below

commit 60b65c42ca2709d20cb59f1617d96a80ec870b6c
Author: Aljoscha Krettek 
AuthorDate: Thu Aug 29 16:33:33 2019 +0200

    [FLINK-13514] Fix instability in StreamTaskTest.testAsyncCheckpointingConcurrentCloseAfterAcknowledge

Before, thread pool shutdown would interrupt our waiting method.
Production code cannot throw an InterruptedException here and would also
not be correct if one is thrown.

We now swallow interrupted exceptions and wait until we successfully
return from await().
---
 .../flink/streaming/runtime/tasks/StreamTaskTest.java| 16 ++--
 1 file changed, 14 insertions(+), 2 deletions(-)

diff --git a/flink-streaming-java/src/test/java/org/apache/flink/streaming/runtime/tasks/StreamTaskTest.java b/flink-streaming-java/src/test/java/org/apache/flink/streaming/runtime/tasks/StreamTaskTest.java
index 946895a..ca6af8c 100644
--- a/flink-streaming-java/src/test/java/org/apache/flink/streaming/runtime/tasks/StreamTaskTest.java
+++ b/flink-streaming-java/src/test/java/org/apache/flink/streaming/runtime/tasks/StreamTaskTest.java
@@ -488,11 +488,23 @@ public class StreamTaskTest extends TestLogger {
 		CheckpointResponder checkpointResponder = mock(CheckpointResponder.class);
 		doAnswer(new Answer() {
 			@Override
-			public Object answer(InvocationOnMock invocation) throws Throwable {
+			public Object answer(InvocationOnMock invocation) {
 				acknowledgeCheckpointLatch.trigger();
 
 				// block here so that we can issue the concurrent cancel call
-				completeAcknowledge.await();
+				while (true) {
+					try {
+						// wait until we successfully await (no pun intended)
+						completeAcknowledge.await();
+
+						// when await() returns normally, we break out of the loop
+						break;
+					} catch (InterruptedException e) {
+						// survive interruptions that arise from thread pool shutdown
+						// production code cannot actually throw InterruptedException from
+						// checkpoint acknowledgement
+					}
+				}
 
 				return null;
 			}



[flink] branch master updated (ccc7eb4 -> ce4bfae)

2019-08-30 Thread trohrmann
This is an automated email from the ASF dual-hosted git repository.

trohrmann pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from ccc7eb4  [FLINK-13248] [datastream/streaming] Enabling custom factories for one input stream operators to be passed in DataStream
 add ce4bfae  [FLINK-9787] Let ExecutionConfig#getGlobalJobParameters always return a non null value

No new revisions were added by this update.

Summary of changes:
 .../java/org/apache/flink/api/common/ExecutionConfig.java| 12 +++-
 .../org/apache/flink/api/common/ExecutionConfigTest.java |  7 +++
 2 files changed, 18 insertions(+), 1 deletion(-)



[flink] 01/02: [FLINK-13819][coordination] Introduce isRunning state for RpcEndpoint

2019-08-30 Thread azagrebin
This is an automated email from the ASF dual-hosted git repository.

azagrebin pushed a commit to branch release-1.9
in repository https://gitbox.apache.org/repos/asf/flink.git

commit 1a837659b71dd02ef6775bd2a4de331aab3ddc2e
Author: Andrey Zagrebin 
AuthorDate: Wed Aug 21 11:19:27 2019 +0200

[FLINK-13819][coordination] Introduce isRunning state for RpcEndpoint

    To better reflect the lifecycle of RpcEndpoint, we suggest to introduce its running state.
    We can use the non-running state e.g. to make decision about how to react on API
    calls if it is already known that the RpcEndpoint is terminating.
---
 .../org/apache/flink/runtime/rpc/RpcEndpoint.java  | 85 --
 .../flink/runtime/rpc/akka/AkkaRpcActor.java   |  4 +-
 .../apache/flink/runtime/rpc/RpcEndpointTest.java  | 72 ++
 3 files changed, 153 insertions(+), 8 deletions(-)

diff --git a/flink-runtime/src/main/java/org/apache/flink/runtime/rpc/RpcEndpoint.java b/flink-runtime/src/main/java/org/apache/flink/runtime/rpc/RpcEndpoint.java
index 5c14a54..7c7a4c1 100644
--- a/flink-runtime/src/main/java/org/apache/flink/runtime/rpc/RpcEndpoint.java
+++ b/flink-runtime/src/main/java/org/apache/flink/runtime/rpc/RpcEndpoint.java
@@ -56,6 +56,36 @@ import static org.apache.flink.util.Preconditions.checkNotNull;
  *
  * The RPC endpoint provides {@link #runAsync(Runnable)}, {@link #callAsync(Callable, Time)}
  * and the {@link #getMainThreadExecutor()} to execute code in the RPC endpoint's main thread.
+ *
+ * <h1>Lifecycle</h1>
+ *
+ * <p>The RPC endpoint has the following stages:
+ * <ul>
+ *     <li>
+ *         The RPC endpoint is created in a non-running state and does not serve any RPC requests.
+ *     </li>
+ *     <li>
+ *         Calling the {@link #start()} method triggers the start of the RPC endpoint and schedules overridable
+ *         {@link #onStart()} method call to the main thread.
+ *     </li>
+ *     <li>
+ *         When the start operation ends the RPC endpoint is moved to the running state
+ *         and starts to serve and complete RPC requests.
+ *     </li>
+ *     <li>
+ *         Calling the {@link #closeAsync()} method triggers the termination of the RPC endpoint and schedules overridable
+ *         {@link #onStop()} method call to the main thread.
+ *     </li>
+ *     <li>
+ *         When {@link #onStop()} method is called, it triggers an asynchronous stop operation.
+ *         The RPC endpoint is not in the running state anymore but it continues to serve RPC requests.
+ *     </li>
+ *     <li>
+ *         When the asynchronous stop operation ends, the RPC endpoint terminates completely
+ *         and does not serve RPC requests anymore.
+ *     </li>
+ * </ul>
+ * <p>The running state can be queried in a RPC method handler or in the main thread by calling {@link #isRunning()} method.
  */
 public abstract class RpcEndpoint implements RpcGateway, AutoCloseableAsync {
 
@@ -80,6 +110,13 @@ public abstract class RpcEndpoint implements RpcGateway, AutoCloseableAsync {
 	private final MainThreadExecutor mainThreadExecutor;
 
 	/**
+	 * Indicates whether the RPC endpoint is started and not stopped or being stopped.
+	 *
+	 * IMPORTANT: the running state is not thread safe and can be used only in the main thread of the rpc endpoint.
+	 */
+	private boolean isRunning;
+
+	/**
 	 * Initializes the RPC endpoint.
 	 *
 	 * @param rpcService The RPC server that dispatches calls to this RPC endpoint.
@@ -112,12 +149,22 @@ public abstract class RpcEndpoint implements RpcGateway, AutoCloseableAsync {
 		return endpointId;
 	}
 
+	/**
+	 * Returns whether the RPC endpoint is started and not stopped or being stopped.
+	 *
+	 * @return whether the RPC endpoint is started and not stopped or being stopped.
+	 */
+	protected boolean isRunning() {
+		validateRunsInMainThread();
+		return isRunning;
+	}
+
 	// ------------------------------------------------------------------------
 	//  Start & shutdown & lifecycle callbacks
 	// ------------------------------------------------------------------------
 
 	/**
-	 * Starts the rpc endpoint. This tells the underlying rpc server that the rpc endpoint is ready
+	 * Triggers start of the rpc endpoint. This tells the underlying rpc server that the rpc endpoint is ready
 	 * to process remote procedure calls.
 	 *
 	 * @throws Exception indicating that something went wrong while starting the RPC endpoint
@@ -127,20 +174,33 @@ public abstract class RpcEndpoint implements RpcGateway, AutoCloseableAsync {
 	}
 
 	/**
-	 * User overridable callback.
+	 * Internal method which is called by the RpcService implementation to start the RpcEndpoint.
+	 *
+	 * @throws Exception indicating that the rpc endpoint could not be started. If an exception occurs,
+	 * then the rpc endpoint

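The design in this commit keeps the `isRunning` flag main-thread-confined instead of making it volatile: it is only read and written from the endpoint's single main thread, so no synchronization is needed. A minimal, self-contained sketch of that idea (a hypothetical `SimpleEndpoint`, not Flink's actual `RpcEndpoint`; the main-thread validation and asynchronous scheduling are elided) could look like:

```java
public class SimpleEndpoint {

	// Not thread-safe by design: like the commit above, this flag must only
	// be read or written from the endpoint's single main thread.
	private boolean isRunning;

	public final void start() {
		isRunning = true;
		onStart(); // overridable hook; in the real design it is scheduled to the main thread
	}

	public final void stop() {
		// The endpoint leaves the running state first, so handlers that
		// check isRunning() stop reacting, while asynchronous shutdown
		// can still serve in-flight requests.
		isRunning = false;
		onStop();
	}

	protected boolean isRunning() {
		return isRunning;
	}

	protected void onStart() {}

	protected void onStop() {}
}
```

The second commit in this push uses exactly this kind of check: `disconnectResourceManager` only triggers a reconnect while the endpoint is still running.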
[flink] 02/02: [FLINK-13769][Coordination] Close RM connection in TaskExecutor.onStop and do not reconnect

2019-08-30 Thread azagrebin
This is an automated email from the ASF dual-hosted git repository.

azagrebin pushed a commit to branch release-1.9
in repository https://gitbox.apache.org/repos/asf/flink.git

commit b682e9a316669f30fa4bfcaa32a8fa0d3ac1dc02
Author: Andrey Zagrebin 
AuthorDate: Mon Aug 19 16:20:39 2019 +0200

    [FLINK-13769][Coordination] Close RM connection in TaskExecutor.onStop and do not reconnect

This prevents JM from acquiring slots which belong to the stopped TM.
---
 .../apache/flink/runtime/taskexecutor/TaskExecutor.java| 14 +++---
 1 file changed, 7 insertions(+), 7 deletions(-)

diff --git a/flink-runtime/src/main/java/org/apache/flink/runtime/taskexecutor/TaskExecutor.java b/flink-runtime/src/main/java/org/apache/flink/runtime/taskexecutor/TaskExecutor.java
index c8bcaf9..b1238ec 100644
--- a/flink-runtime/src/main/java/org/apache/flink/runtime/taskexecutor/TaskExecutor.java
+++ b/flink-runtime/src/main/java/org/apache/flink/runtime/taskexecutor/TaskExecutor.java
@@ -340,11 +340,10 @@ public class TaskExecutor extends RpcEndpoint implements TaskExecutorGateway {
 
 		Throwable jobManagerDisconnectThrowable = null;
 
-		if (resourceManagerConnection != null) {
-			resourceManagerConnection.close();
-		}
-
 		FlinkException cause = new FlinkException("The TaskExecutor is shutting down.");
+
+		closeResourceManagerConnection(cause);
+
 		for (JobManagerConnection jobManagerConnection : jobManagerConnections.values()) {
 			try {
 				disassociateFromJobManager(jobManagerConnection, cause);
@@ -958,7 +957,9 @@ public class TaskExecutor extends RpcEndpoint implements TaskExecutorGateway {
 
 	@Override
 	public void disconnectResourceManager(Exception cause) {
-		reconnectToResourceManager(cause);
+		if (isRunning()) {
+			reconnectToResourceManager(cause);
+		}
 	}
 
 	// ======================================================================
@@ -986,6 +987,7 @@ public class TaskExecutor extends RpcEndpoint implements TaskExecutorGateway {
 
 	private void reconnectToResourceManager(Exception cause) {
 		closeResourceManagerConnection(cause);
+		startRegistrationTimeout();
 		tryConnectToResourceManager();
 	}
 
@@ -1098,8 +1100,6 @@ public class TaskExecutor extends RpcEndpoint implements TaskExecutorGateway {
 			resourceManagerConnection.close();
 			resourceManagerConnection = null;
 		}
-
-		startRegistrationTimeout();
 	}
 
 	private void startRegistrationTimeout() {



[flink] branch release-1.9 updated (c175cc4 -> b682e9a)

2019-08-30 Thread azagrebin
This is an automated email from the ASF dual-hosted git repository.

azagrebin pushed a change to branch release-1.9
in repository https://gitbox.apache.org/repos/asf/flink.git.


from c175cc4  [hotfix][FLINK-13901][docs] Fix documentation links check errors
 new 1a83765  [FLINK-13819][coordination] Introduce isRunning state for RpcEndpoint
 new b682e9a  [FLINK-13769][Coordination] Close RM connection in TaskExecutor.onStop and do not reconnect

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../org/apache/flink/runtime/rpc/RpcEndpoint.java  | 85 --
 .../flink/runtime/rpc/akka/AkkaRpcActor.java   |  4 +-
 .../flink/runtime/taskexecutor/TaskExecutor.java   | 14 ++--
 .../apache/flink/runtime/rpc/RpcEndpointTest.java  | 72 ++
 4 files changed, 160 insertions(+), 15 deletions(-)



[flink] branch master updated (a40b31b -> ccc7eb4)

2019-08-30 Thread pnowojski
This is an automated email from the ASF dual-hosted git repository.

pnowojski pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from a40b31b  [FLINK-13877][hive] Support Hive version 2.1.0 and 2.1.1
 add b291bd5  [hotfix] [runtime] Replacing mockito in AsyncWaitOperatorTest
 add 8805406  [FLINK-13248][runtime] Implement per operator priorities for mailbox actions and yieldToDownstream concept
 add 0d8670b  [FLINK-13248] [runtime] Adding and using YieldableOperatorFactory to pass MailboxExecutor view to operator
 add ccc7eb4  [FLINK-13248] [datastream/streaming] Enabling custom factories for one input stream operators to be passed in DataStream

No new revisions were added by this update.

Summary of changes:
 .../flink/state/api/output/BoundedStreamTask.java  |   7 +-
 .../flink/streaming/api/datastream/DataStream.java |  41 +++-
 ...put.java => OneInputStreamOperatorFactory.java} |  15 +-
 .../api/operators/StreamOperatorFactoryUtil.java   |  51 +
 ...dOneInput.java => YieldingOperatorFactory.java} |  14 +-
 .../streaming/runtime/tasks/OperatorChain.java |  33 ++-
 .../flink/streaming/runtime/tasks/StreamTask.java  |   6 +-
 .../streaming/runtime/tasks/mailbox/Mailbox.java   |  55 +
 .../runtime/tasks/mailbox/MailboxReceiver.java |   5 -
 .../runtime/tasks/mailbox/MailboxSender.java   |   9 +
 .../runtime/tasks/mailbox/TaskMailbox.java | 137 
 .../{MailboxImpl.java => TaskMailboxImpl.java} | 108 +++--
 .../tasks/mailbox/execution/MailboxExecutor.java   |  18 ++
 .../mailbox/execution/MailboxExecutorFactory.java  |  19 +-
 ...orServiceImpl.java => MailboxExecutorImpl.java} |  35 +--
 .../mailbox/execution/MailboxExecutorService.java  |  29 ---
 .../tasks/mailbox/execution/MailboxProcessor.java  |  57 +++--
 .../api/operators/async/AsyncWaitOperatorTest.java | 249 +++--
 ...ilboxImplTest.java => TaskMailboxImplTest.java} | 159 +
 ...eImplTest.java => MailboxExecutorImplTest.java} |  69 +++---
 ...ssorTest.java => TaskMailboxProcessorTest.java} |  19 +-
 .../util/AbstractStreamOperatorTestHarness.java|  97 ++--
 .../flink/streaming/util/MockStreamTask.java   |  22 +-
 .../util/OneInputStreamOperatorTestHarness.java|  59 -
 .../table/runtime/harness/HarnessTestBase.scala|   8 +-
 25 files changed, 807 insertions(+), 514 deletions(-)
 copy flink-streaming-java/src/main/java/org/apache/flink/streaming/api/operators/{BoundedOneInput.java => OneInputStreamOperatorFactory.java} (74%)
 mode change 100755 => 100644
 create mode 100644 flink-streaming-java/src/main/java/org/apache/flink/streaming/api/operators/StreamOperatorFactoryUtil.java
 copy flink-streaming-java/src/main/java/org/apache/flink/streaming/api/operators/{BoundedOneInput.java => YieldingOperatorFactory.java} (68%)
 mode change 100755 => 100644
 create mode 100644 flink-streaming-java/src/main/java/org/apache/flink/streaming/runtime/tasks/mailbox/TaskMailbox.java
 rename flink-streaming-java/src/main/java/org/apache/flink/streaming/runtime/tasks/mailbox/{MailboxImpl.java => TaskMailboxImpl.java} (65%)
 copy flink-runtime/src/main/java/org/apache/flink/runtime/executiongraph/failover/flip1/ResultPartitionAvailabilityChecker.java => flink-streaming-java/src/main/java/org/apache/flink/streaming/runtime/tasks/mailbox/execution/MailboxExecutorFactory.java (60%)
 rename flink-streaming-java/src/main/java/org/apache/flink/streaming/runtime/tasks/mailbox/execution/{MailboxExecutorServiceImpl.java => MailboxExecutorImpl.java} (76%)
 delete mode 100644 flink-streaming-java/src/main/java/org/apache/flink/streaming/runtime/tasks/mailbox/execution/MailboxExecutorService.java
 rename flink-streaming-java/src/test/java/org/apache/flink/streaming/runtime/tasks/mailbox/{MailboxImplTest.java => TaskMailboxImplTest.java} (52%)
 rename flink-streaming-java/src/test/java/org/apache/flink/streaming/runtime/tasks/mailbox/execution/{MailboxExecutorServiceImplTest.java => MailboxExecutorImplTest.java} (63%)
 rename flink-streaming-java/src/test/java/org/apache/flink/streaming/runtime/tasks/mailbox/execution/{MailboxProcessorTest.java => TaskMailboxProcessorTest.java} (91%)