[flink] branch release-1.11 updated: [FLINK-18349][docs] Add release notes for Flink 1.11

2020-06-25 Thread pnowojski
This is an automated email from the ASF dual-hosted git repository.

pnowojski pushed a commit to branch release-1.11
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.11 by this push:
 new 6ecf2d3  [FLINK-18349][docs] Add release notes for Flink 1.11
6ecf2d3 is described below

commit 6ecf2d309668bd790720a3cc12f065281f7b8411
Author: Piotr Nowojski 
AuthorDate: Wed Jun 17 19:20:20 2020 +0200

[FLINK-18349][docs] Add release notes for Flink 1.11
---
 docs/index.md   |   2 +-
 docs/index.zh.md|   2 +-
 docs/release-notes/flink-1.11.md| 287 
 docs/release-notes/flink-1.11.zh.md | 287 
 4 files changed, 576 insertions(+), 2 deletions(-)

diff --git a/docs/index.md b/docs/index.md
index afccb69..8ec9672 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -81,7 +81,7 @@ Before putting your Flink job into production, read the 
[Production Readiness Ch
 
 Release notes cover important changes between Flink versions. Please read them 
carefully if you plan to upgrade your Flink setup.
 
-See the release notes for [Flink 1.10]({% link release-notes/flink-1.10.md 
%}), [Flink 1.9]({% link release-notes/flink-1.9.md %}), [Flink 1.8]({% link 
release-notes/flink-1.8.md %}), or [Flink 1.7]({% link 
release-notes/flink-1.7.md %}).
+See the release notes for [Flink 1.11]({% link release-notes/flink-1.11.md 
%}), [Flink 1.10]({% link release-notes/flink-1.10.md %}), [Flink 1.9]({% link 
release-notes/flink-1.9.md %}), [Flink 1.8]({% link release-notes/flink-1.8.md 
%}), or [Flink 1.7]({% link release-notes/flink-1.7.md %}).
 
 
 
diff --git a/docs/index.zh.md b/docs/index.zh.md
index 0f83e89..13c2ef4 100644
--- a/docs/index.zh.md
+++ b/docs/index.zh.md
@@ -81,7 +81,7 @@ Apache Flink 是一个在无界和有界数据流上进行状态计算的框架
 
 release notes 包含了 Flink 版本之间的重大更新。请在你升级 Flink 之前仔细阅读相应的 release notes。
 
-请阅读 release notes [Flink 1.10]({% link release-notes/flink-1.10.zh.md %}), 
[Flink 1.9]({% link release-notes/flink-1.9.zh.md %}), [Flink 1.8]({% link 
release-notes/flink-1.8.zh.md %}), or [Flink 1.7]({% link 
release-notes/flink-1.7.zh.md %}).
+请阅读 release notes [Flink 1.11]({% link release-notes/flink-1.11.zh.md %}), 
[Flink 1.10]({% link release-notes/flink-1.10.zh.md %}), [Flink 1.9]({% link 
release-notes/flink-1.9.zh.md %}), [Flink 1.8]({% link 
release-notes/flink-1.8.zh.md %}), or [Flink 1.7]({% link 
release-notes/flink-1.7.zh.md %}).
 
 
 
diff --git a/docs/release-notes/flink-1.11.md b/docs/release-notes/flink-1.11.md
new file mode 100644
index 000..be18a0b
--- /dev/null
+++ b/docs/release-notes/flink-1.11.md
@@ -0,0 +1,287 @@
+---
+title: "Release Notes - Flink 1.11"
+---
+
+
+
+These release notes discuss important aspects, such as configuration, behavior,
+or dependencies, that changed between Flink 1.10 and Flink 1.11. Please read
+these notes carefully if you are planning to upgrade your Flink version to 
1.11.
+
+* This will be replaced by the TOC
+{:toc}
+
+### Clusters & Deployment
+#### Support for Hadoop 3.0.0 and higher ([FLINK-11086](https://issues.apache.org/jira/browse/FLINK-11086))
+The Flink project does not provide any updated `flink-shaded-hadoop-*` jars.
+Users need to provide Hadoop dependencies through the `HADOOP_CLASSPATH` environment variable (recommended) or via the `lib/` folder.
+Also, the `include-hadoop` Maven profile has been removed.
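For example, the recommended way to make an existing Hadoop installation visible to Flink is to export the classpath that the `hadoop` command reports (a sketch assuming the `hadoop` launcher is on the `PATH`; adjust for your distribution):

```shell
# Make Hadoop dependencies visible to Flink without bundled flink-shaded-hadoop jars.
# Assumes a local Hadoop installation with the `hadoop` launcher on the PATH.
export HADOOP_CLASSPATH=$(hadoop classpath)
```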
+
+#### Removal of `LegacyScheduler` ([FLINK-15629](https://issues.apache.org/jira/browse/FLINK-15629))
+Flink no longer supports the legacy scheduler.
+Hence, setting `jobmanager.scheduler: legacy` will no longer work and will fail with an `IllegalArgumentException`.
+The only valid option for `jobmanager.scheduler` is the default value `ng`.
+
+#### Bind user code class loader to lifetime of a slot ([FLINK-16408](https://issues.apache.org/jira/browse/FLINK-16408))
+The user code class loader is reused by the `TaskExecutor` as long as at least one slot is allocated for the respective job.
+This changes Flink's recovery behaviour slightly, in that it will not reload static fields.
+The benefit is that this change drastically reduces pressure on the JVM's metaspace.
+
+#### Replaced `slaves` file name with `workers` ([FLINK-18307](https://issues.apache.org/jira/browse/FLINK-18307))
+For standalone setups, the file listing the worker nodes is no longer called `slaves` but `workers`.
+Previous setups that use the `start-cluster.sh` and `stop-cluster.sh` scripts need to rename that file.
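When upgrading an existing standalone installation, renaming the file in place is enough for the cluster scripts to pick it up (a sketch; `$FLINK_HOME` is a placeholder for your installation directory):

```shell
# Rename the worker-node list so start-cluster.sh / stop-cluster.sh find it.
# $FLINK_HOME is assumed to point at the upgraded Flink 1.11 installation.
mv "$FLINK_HOME/conf/slaves" "$FLINK_HOME/conf/workers"
```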
+
+#### Flink Docker Integration Improvements
+The example `Dockerfiles` and Docker image `build.sh` scripts have been removed from [the Flink Github repository](https://github.com/apache/flink). The examples will no longer be maintained by the community in the Flink Github repository, including the examples of integration with Bluemix. Therefore, the following modules have been deleted 

[flink] branch master updated (e633453 -> 6227fff)

2020-06-25 Thread pnowojski
This is an automated email from the ASF dual-hosted git repository.

pnowojski pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from e633453  [FLINK-18349][docs] Add release notes for Flink 1.11
 add 6227fff  fixup! [FLINK-18349][docs] Add release notes for Flink 1.11

No new revisions were added by this update.

Summary of changes:
 docs/release-notes/flink-1.11.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)



[flink] branch master updated (6834ed1 -> e633453)

2020-06-25 Thread pnowojski
This is an automated email from the ASF dual-hosted git repository.

pnowojski pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from 6834ed1  [FLINK-18417][table] Support List as a conversion class for 
ARRAY
 add e633453  [FLINK-18349][docs] Add release notes for Flink 1.11

No new revisions were added by this update.

Summary of changes:
 docs/index.md   |   2 +-
 docs/index.zh.md|   2 +-
 docs/release-notes/flink-1.11.md| 287 
 docs/release-notes/flink-1.11.zh.md | 287 
 4 files changed, 576 insertions(+), 2 deletions(-)
 create mode 100644 docs/release-notes/flink-1.11.md
 create mode 100644 docs/release-notes/flink-1.11.zh.md



[flink] branch master updated: [FLINK-18417][table] Support List as a conversion class for ARRAY

2020-06-25 Thread twalthr
This is an automated email from the ASF dual-hosted git repository.

twalthr pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 6834ed1  [FLINK-18417][table] Support List as a conversion class for 
ARRAY
6834ed1 is described below

commit 6834ed181aa9edda6b9c7fb61696de93170a88d1
Author: Timo Walther 
AuthorDate: Tue Jun 23 17:16:35 2020 +0200

[FLINK-18417][table] Support List as a conversion class for ARRAY

This closes #12765.
---
 docs/dev/table/types.md|  2 +
 docs/dev/table/types.zh.md |  2 +
 .../table/types/extraction/DataTypeExtractor.java  | 40 --
 .../flink/table/types/logical/ArrayType.java   |  4 +
 .../apache/flink/table/types/LogicalTypesTest.java |  6 +-
 .../types/extraction/DataTypeExtractorTest.java| 16 
 .../table/data/conversion/ArrayListConverter.java  | 86 ++
 .../data/conversion/ArrayObjectArrayConverter.java |  4 +-
 .../data/conversion/DataStructureConverters.java   |  8 ++
 .../table/data/DataStructureConvertersTest.java| 55 --
 10 files changed, 207 insertions(+), 16 deletions(-)

diff --git a/docs/dev/table/types.md b/docs/dev/table/types.md
index 16991d6..698deca 100644
--- a/docs/dev/table/types.md
+++ b/docs/dev/table/types.md
@@ -1007,6 +1007,8 @@ equivalent to `ARRAY`.
 | Java Type  | Input | Output | Remarks
   |
 
|:---|:-:|:--:|:--|
 |*t*`[]` | (X)   | (X)| Depends on the 
subtype. *Default* |
+| `java.util.List<t>`                     | X     | X      |                                    |
+| *subclass* of `java.util.List<t>`       | X     |        |                                    |
 |`org.apache.flink.table.data.ArrayData` | X | X  | Internal data 
structure.  |
 
  `MAP`
diff --git a/docs/dev/table/types.zh.md b/docs/dev/table/types.zh.md
index 252ff02..582bcd4 100644
--- a/docs/dev/table/types.zh.md
+++ b/docs/dev/table/types.zh.md
@@ -925,6 +925,8 @@ DataTypes.ARRAY(t)
 | Java 类型 | 输入 | 输出 | 备注   |
 |:--|:-:|:--:|:--|
 |*t*`[]`| (X)   | (X)| 依赖于子类型。 *缺省* |
+| `java.util.List<t>` | X | X | |
+| *subclass* of `java.util.List<t>` | X | | |
 
  `MAP`
 
diff --git 
a/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/extraction/DataTypeExtractor.java
 
b/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/extraction/DataTypeExtractor.java
index ff353fd..3b20183 100644
--- 
a/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/extraction/DataTypeExtractor.java
+++ 
b/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/extraction/DataTypeExtractor.java
@@ -314,15 +314,35 @@ public final class DataTypeExtractor {
return DataTypes.ARRAY(
extractDataTypeOrRaw(template, typeHierarchy, 
genericArray.getGenericComponentType()));
}
+
+   final Class<?> clazz = toClass(type);
+   if (clazz == null) {
+   return null;
+   }
+
// for my.custom.Pojo[][]
-   else if (type instanceof Class) {
-   final Class<?> clazz = (Class<?>) type;
-   if (clazz.isArray()) {
-   return DataTypes.ARRAY(
-   extractDataTypeOrRaw(template, 
typeHierarchy, clazz.getComponentType()));
-   }
+   if (clazz.isArray()) {
+   return DataTypes.ARRAY(
+   extractDataTypeOrRaw(template, typeHierarchy, 
clazz.getComponentType()));
}
-   return null;
+
+   // for List
+   // we only allow List here (not a subclass) because we cannot 
guarantee more specific
+   // data structures after conversion
+   if (clazz != List.class) {
+   return null;
+   }
+   if (!(type instanceof ParameterizedType)) {
+   throw extractionError(
+   "The class '%s' needs generic parameters for an 
array type.",
+   List.class.getName());
+   }
+   final ParameterizedType parameterizedType = (ParameterizedType) 
type;
+   final DataType element = extractDataTypeOrRaw(
+   template,
+   typeHierarchy,
+   parameterizedType.getActualTypeArguments()[0]);
+   return 
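The rule implemented above — accept `java.util.List` itself, reject subclasses, and require a type argument — can be sketched with plain reflection, independent of Flink (the class and method names below are illustrative, not Flink API):

```java
import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;
import java.util.List;

public class ListElementTypeSketch {

    /**
     * Resolves the element type of a List type, mirroring the rules in the
     * commit above: only java.util.List itself (not a subclass) qualifies,
     * and it must carry a type argument. Returns null when it does not match.
     */
    static Type listElementType(Type type) {
        if (!(type instanceof ParameterizedType)) {
            return null; // raw List, or not a generic type at all
        }
        ParameterizedType parameterized = (ParameterizedType) type;
        if (parameterized.getRawType() != List.class) {
            return null; // subclasses of List are rejected
        }
        return parameterized.getActualTypeArguments()[0];
    }

    // Field used only to obtain a ParameterizedType via reflection.
    @SuppressWarnings("unused")
    private List<String> sample;

    public static void main(String[] args) throws Exception {
        Type fieldType = ListElementTypeSketch.class
                .getDeclaredField("sample").getGenericType();
        System.out.println(listElementType(fieldType)); // class java.lang.String
    }
}
```

Running `main` prints `class java.lang.String`, mirroring how the extractor resolves the `ARRAY` element type from the `List`'s type argument.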

[flink] branch master updated: [FLINK-17300] Log the lineage information between ExecutionAttemptID and AllocationID

2020-06-25 Thread trohrmann
This is an automated email from the ASF dual-hosted git repository.

trohrmann pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 7e48549  [FLINK-17300] Log the lineage information between 
ExecutionAttemptID and AllocationID
7e48549 is described below

commit 7e48549b822bc54f8dc0a5e1f9cbb5f3156fda06
Author: Yangze Guo 
AuthorDate: Wed Apr 22 14:14:29 2020 +0800

[FLINK-17300] Log the lineage information between ExecutionAttemptID and 
AllocationID

This closes #11852.
---
 .../java/org/apache/flink/runtime/executiongraph/Execution.java | 6 ++
 .../java/org/apache/flink/runtime/taskexecutor/TaskExecutor.java| 3 ++-
 .../org/apache/flink/yarn/YARNSessionCapacitySchedulerITCase.java   | 2 +-
 3 files changed, 5 insertions(+), 6 deletions(-)

diff --git 
a/flink-runtime/src/main/java/org/apache/flink/runtime/executiongraph/Execution.java
 
b/flink-runtime/src/main/java/org/apache/flink/runtime/executiongraph/Execution.java
index f6702f8..56415e0 100644
--- 
a/flink-runtime/src/main/java/org/apache/flink/runtime/executiongraph/Execution.java
+++ 
b/flink-runtime/src/main/java/org/apache/flink/runtime/executiongraph/Execution.java
@@ -728,10 +728,8 @@ public class Execution implements AccessExecution, 
Archiveable

[flink] branch release-1.10 updated: [FLINK-18168][table-runtime-blink] Fix array reuse for BinaryArrayData in converters

2020-06-25 Thread twalthr
This is an automated email from the ASF dual-hosted git repository.

twalthr pushed a commit to branch release-1.10
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.10 by this push:
 new 23ad7e3  [FLINK-18168][table-runtime-blink] Fix array reuse for 
BinaryArrayData in converters
23ad7e3 is described below

commit 23ad7e33117b7c02b28fda77596b12668b5117c1
Author: zoudan 
AuthorDate: Tue Jun 9 16:12:11 2020 +0800

[FLINK-18168][table-runtime-blink] Fix array reuse for BinaryArrayData in 
converters

This closes #12542.
---
 .../apache/flink/table/dataformat/DataFormatConverters.java  |  2 +-
 .../flink/table/dataformat/DataFormatConvertersTest.java | 12 +++-
 2 files changed, 12 insertions(+), 2 deletions(-)

diff --git 
a/flink-table/flink-table-runtime-blink/src/main/java/org/apache/flink/table/dataformat/DataFormatConverters.java
 
b/flink-table/flink-table-runtime-blink/src/main/java/org/apache/flink/table/dataformat/DataFormatConverters.java
index 5ed8aa4..9d9fcca 100644
--- 
a/flink-table/flink-table-runtime-blink/src/main/java/org/apache/flink/table/dataformat/DataFormatConverters.java
+++ 
b/flink-table/flink-table-runtime-blink/src/main/java/org/apache/flink/table/dataformat/DataFormatConverters.java
@@ -,7 +,7 @@ public class DataFormatConverters {
}
}
reuseWriter.complete();
-   return reuseArray;
+   return reuseArray.copy();
}
 
@Override
diff --git 
a/flink-table/flink-table-runtime-blink/src/test/java/org/apache/flink/table/dataformat/DataFormatConvertersTest.java
 
b/flink-table/flink-table-runtime-blink/src/test/java/org/apache/flink/table/dataformat/DataFormatConvertersTest.java
index 2ca2c87..079e32e 100644
--- 
a/flink-table/flink-table-runtime-blink/src/test/java/org/apache/flink/table/dataformat/DataFormatConvertersTest.java
+++ 
b/flink-table/flink-table-runtime-blink/src/test/java/org/apache/flink/table/dataformat/DataFormatConvertersTest.java
@@ -146,9 +146,18 @@ public class DataFormatConvertersTest {
}
 
private static void test(TypeInformation typeInfo, Object value) {
+   test(typeInfo, value, null);
+   }
+
+   private static void test(TypeInformation typeInfo, Object value, Object 
anotherValue) {
DataFormatConverter converter = getConverter(typeInfo);
+   final Object innerValue = converter.toInternal(value);
+   if (anotherValue != null) {
+   converter.toInternal(anotherValue);
+   }
+
Assert.assertTrue(Arrays.deepEquals(
-   new Object[] 
{converter.toExternal(converter.toInternal(value))}, new Object[] {value}));
+   new Object[] {converter.toExternal(innerValue)}, new 
Object[]{value}));
}
 
private static DataFormatConverter getConverter(DataType dataType) {
@@ -191,6 +200,7 @@ public class DataFormatConvertersTest {
test(BasicArrayTypeInfo.DOUBLE_ARRAY_TYPE_INFO, new Double[] 
{null, null});
test(ObjectArrayTypeInfo.getInfoFor(Types.STRING), new String[] 
{null, null});
test(ObjectArrayTypeInfo.getInfoFor(Types.STRING), new String[] 
{"haha", "hehe"});
+   test(ObjectArrayTypeInfo.getInfoFor(Types.STRING), new String[] 
{"haha", "hehe"}, new String[] {"aa", "bb"});
test(new MapTypeInfo<>(Types.STRING, Types.INT), null);
 
HashMap map = new HashMap<>();
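The bug fixed above is the classic shared-buffer pitfall: a converter that returns its reusable array lets a later conversion silently overwrite an earlier result, which is why the fix returns `reuseArray.copy()`. A minimal standalone sketch of the failure mode and the fix (plain Java, not the Flink classes):

```java
import java.util.Arrays;

public class ReuseBugSketch {

    // Shared buffer, analogous to the converter's reuseArray.
    static final int[] REUSED = new int[2];

    // Buggy variant: every call returns the same shared buffer.
    static int[] convertReusing(int a, int b) {
        REUSED[0] = a;
        REUSED[1] = b;
        return REUSED; // later calls mutate earlier results
    }

    // Fixed variant, analogous to returning reuseArray.copy().
    static int[] convertCopying(int a, int b) {
        REUSED[0] = a;
        REUSED[1] = b;
        return Arrays.copyOf(REUSED, REUSED.length);
    }

    public static void main(String[] args) {
        int[] first = convertReusing(1, 2);
        convertReusing(3, 4);
        System.out.println(Arrays.toString(first)); // [3, 4] -- corrupted

        int[] firstCopy = convertCopying(1, 2);
        convertCopying(3, 4);
        System.out.println(Arrays.toString(firstCopy)); // [1, 2] -- preserved
    }
}
```

The copying variant trades one allocation per conversion for correctness, which is the same trade-off the commit makes.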



[flink] branch release-1.11 updated: [FLINK-18168][table-runtime-blink] Fix array reuse for BinaryArrayData in converters

2020-06-25 Thread twalthr
This is an automated email from the ASF dual-hosted git repository.

twalthr pushed a commit to branch release-1.11
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.11 by this push:
 new f5ac8c3  [FLINK-18168][table-runtime-blink] Fix array reuse for 
BinaryArrayData in converters
f5ac8c3 is described below

commit f5ac8c352b7fb5aff5a78cffa348f72bd8492509
Author: zoudan 
AuthorDate: Tue Jun 9 16:12:11 2020 +0800

[FLINK-18168][table-runtime-blink] Fix array reuse for BinaryArrayData in 
converters

This closes #12542.
---
 .../data/conversion/ArrayObjectArrayConverter.java|  2 +-
 .../flink/table/data/util/DataFormatConverters.java   |  2 +-
 .../flink/table/data/DataFormatConvertersTest.java| 12 +++-
 .../flink/table/data/DataStructureConvertersTest.java | 19 +++
 4 files changed, 32 insertions(+), 3 deletions(-)

diff --git 
a/flink-table/flink-table-runtime-blink/src/main/java/org/apache/flink/table/data/conversion/ArrayObjectArrayConverter.java
 
b/flink-table/flink-table-runtime-blink/src/main/java/org/apache/flink/table/data/conversion/ArrayObjectArrayConverter.java
index 5049064..c191c55 100644
--- 
a/flink-table/flink-table-runtime-blink/src/main/java/org/apache/flink/table/data/conversion/ArrayObjectArrayConverter.java
+++ 
b/flink-table/flink-table-runtime-blink/src/main/java/org/apache/flink/table/data/conversion/ArrayObjectArrayConverter.java
@@ -115,7 +115,7 @@ public class ArrayObjectArrayConverter implements 
DataStructureConverter(Types.STRING, Types.INT), null);
 
HashMap map = new HashMap<>();
diff --git 
a/flink-table/flink-table-runtime-blink/src/test/java/org/apache/flink/table/data/DataStructureConvertersTest.java
 
b/flink-table/flink-table-runtime-blink/src/test/java/org/apache/flink/table/data/DataStructureConvertersTest.java
index 610d53c..29710fa 100644
--- 
a/flink-table/flink-table-runtime-blink/src/test/java/org/apache/flink/table/data/DataStructureConvertersTest.java
+++ 
b/flink-table/flink-table-runtime-blink/src/test/java/org/apache/flink/table/data/DataStructureConvertersTest.java
@@ -345,6 +345,12 @@ public class DataStructureConvertersTest {
null
)
})
+   .convertedToWithAnotherValue(
+   Row[].class,
+   new Row[] {
+   Row.of(null, null),
+   Row.of(new 
PojoWithImmutableFields(10, "Bob"), null)
+   })
);
}
 
@@ -369,6 +375,11 @@ public class DataStructureConvertersTest {
 
final Object internalValue = 
fromConverter.toInternalOrNull(from.getValue());
 
+   final Object anotherValue = 
testSpec.conversionsWithAnotherValue.get(from.getKey());
+   if (anotherValue != null) {
+   fromConverter.toInternalOrNull(anotherValue);
+   }
+
for (Map.Entry<Class<?>, Object> to : testSpec.conversions.entrySet()) {
final DataType toDataType = 
testSpec.dataType.bridgedTo(to.getKey());
 
@@ -395,12 +406,15 @@ public class DataStructureConvertersTest {
 
 private final Map<Class<?>, Object> conversions;

+   private final Map<Class<?>, Object> conversionsWithAnotherValue;
+
private @Nullable String expectedErrorMessage;
 
private TestSpec(String description, DataType dataType) {
this.description = description;
this.dataType = dataType;
this.conversions = new LinkedHashMap<>();
+   this.conversionsWithAnotherValue = new 
LinkedHashMap<>();
}
 
static TestSpec forDataType(AbstractDataType dataType) {
@@ -420,6 +434,11 @@ public class DataStructureConvertersTest {
return this;
}
 
+<T> TestSpec convertedToWithAnotherValue(Class<T> clazz, T value) {
+   conversionsWithAnotherValue.put(clazz, value);
+   return this;
+   }
+
 <T> TestSpec convertedToSupplier(Class<T> clazz, Supplier<T> supplier) {
conversions.put(clazz, supplier.get());
return this;



[flink] branch master updated: [FLINK-18168][table-runtime-blink] Fix array reuse for BinaryArrayData in converters

2020-06-25 Thread twalthr
This is an automated email from the ASF dual-hosted git repository.

twalthr pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 581fabe  [FLINK-18168][table-runtime-blink] Fix array reuse for 
BinaryArrayData in converters
581fabe is described below

commit 581fabe7df12961d20753dff8947dec07cbb2c56
Author: zoudan 
AuthorDate: Tue Jun 9 16:12:11 2020 +0800

[FLINK-18168][table-runtime-blink] Fix array reuse for BinaryArrayData in 
converters

This closes #12542.
---
 .../data/conversion/ArrayObjectArrayConverter.java|  2 +-
 .../flink/table/data/util/DataFormatConverters.java   |  2 +-
 .../flink/table/data/DataFormatConvertersTest.java| 12 +++-
 .../flink/table/data/DataStructureConvertersTest.java | 19 +++
 4 files changed, 32 insertions(+), 3 deletions(-)

diff --git 
a/flink-table/flink-table-runtime-blink/src/main/java/org/apache/flink/table/data/conversion/ArrayObjectArrayConverter.java
 
b/flink-table/flink-table-runtime-blink/src/main/java/org/apache/flink/table/data/conversion/ArrayObjectArrayConverter.java
index 5049064..c191c55 100644
--- 
a/flink-table/flink-table-runtime-blink/src/main/java/org/apache/flink/table/data/conversion/ArrayObjectArrayConverter.java
+++ 
b/flink-table/flink-table-runtime-blink/src/main/java/org/apache/flink/table/data/conversion/ArrayObjectArrayConverter.java
@@ -115,7 +115,7 @@ public class ArrayObjectArrayConverter implements 
DataStructureConverter(Types.STRING, Types.INT), null);
 
HashMap map = new HashMap<>();
diff --git 
a/flink-table/flink-table-runtime-blink/src/test/java/org/apache/flink/table/data/DataStructureConvertersTest.java
 
b/flink-table/flink-table-runtime-blink/src/test/java/org/apache/flink/table/data/DataStructureConvertersTest.java
index 610d53c..29710fa 100644
--- 
a/flink-table/flink-table-runtime-blink/src/test/java/org/apache/flink/table/data/DataStructureConvertersTest.java
+++ 
b/flink-table/flink-table-runtime-blink/src/test/java/org/apache/flink/table/data/DataStructureConvertersTest.java
@@ -345,6 +345,12 @@ public class DataStructureConvertersTest {
null
)
})
+   .convertedToWithAnotherValue(
+   Row[].class,
+   new Row[] {
+   Row.of(null, null),
+   Row.of(new 
PojoWithImmutableFields(10, "Bob"), null)
+   })
);
}
 
@@ -369,6 +375,11 @@ public class DataStructureConvertersTest {
 
final Object internalValue = 
fromConverter.toInternalOrNull(from.getValue());
 
+   final Object anotherValue = 
testSpec.conversionsWithAnotherValue.get(from.getKey());
+   if (anotherValue != null) {
+   fromConverter.toInternalOrNull(anotherValue);
+   }
+
for (Map.Entry<Class<?>, Object> to : testSpec.conversions.entrySet()) {
final DataType toDataType = 
testSpec.dataType.bridgedTo(to.getKey());
 
@@ -395,12 +406,15 @@ public class DataStructureConvertersTest {
 
 private final Map<Class<?>, Object> conversions;

+   private final Map<Class<?>, Object> conversionsWithAnotherValue;
+
private @Nullable String expectedErrorMessage;
 
private TestSpec(String description, DataType dataType) {
this.description = description;
this.dataType = dataType;
this.conversions = new LinkedHashMap<>();
+   this.conversionsWithAnotherValue = new 
LinkedHashMap<>();
}
 
static TestSpec forDataType(AbstractDataType dataType) {
@@ -420,6 +434,11 @@ public class DataStructureConvertersTest {
return this;
}
 
+<T> TestSpec convertedToWithAnotherValue(Class<T> clazz, T value) {
+   conversionsWithAnotherValue.put(clazz, value);
+   return this;
+   }
+
 <T> TestSpec convertedToSupplier(Class<T> clazz, Supplier<T> supplier) {
conversions.put(clazz, supplier.get());
return this;



[flink-web] branch asf-site updated (dc268f2 -> 355c2ab)

2020-06-25 Thread liyu
This is an automated email from the ASF dual-hosted git repository.

liyu pushed a change to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git.


from dc268f2  regenerate website
 new a5fea70  [blog] flink on zeppelin - part2
 new 355c2ab  Rebuild website

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 _posts/2020-06-23-flink-on-zeppelin-part2.md   | 109 +++
 content/blog/feed.xml  | 350 +++--
 content/blog/index.html|  39 ++-
 content/blog/page10/index.html |  40 ++-
 content/blog/page11/index.html |  40 ++-
 content/blog/page12/index.html |  25 ++
 content/blog/page2/index.html  |  36 ++-
 content/blog/page3/index.html  |  36 ++-
 content/blog/page4/index.html  |  36 ++-
 content/blog/page5/index.html  |  38 ++-
 content/blog/page6/index.html  |  38 ++-
 content/blog/page7/index.html  |  38 ++-
 content/blog/page8/index.html  |  40 ++-
 content/blog/page9/index.html  |  40 ++-
 .../2020/06/23/flink-on-zeppelin-part2.html}   | 112 ---
 .../flink_append_mode.gif  | Bin 0 -> 294307 bytes
 .../flink_python_udf.png   | Bin 0 -> 83093 bytes
 .../flink_scala_udf.png| Bin 0 -> 84516 bytes
 .../flink_single_mode.gif  | Bin 0 -> 58198 bytes
 .../flink_update_mode.gif  | Bin 0 -> 131055 bytes
 content/index.html |   9 +-
 content/zh/index.html  |   9 +-
 .../flink_append_mode.gif  | Bin 0 -> 294307 bytes
 .../flink_python_udf.png   | Bin 0 -> 83093 bytes
 .../flink_scala_udf.png| Bin 0 -> 84516 bytes
 .../flink_single_mode.gif  | Bin 0 -> 58198 bytes
 .../flink_update_mode.gif  | Bin 0 -> 131055 bytes
 27 files changed, 586 insertions(+), 449 deletions(-)
 create mode 100644 _posts/2020-06-23-flink-on-zeppelin-part2.md
 copy content/{news/2020/06/15/flink-on-zeppelin-part1.html => 
ecosystem/2020/06/23/flink-on-zeppelin-part2.html} (69%)
 create mode 100644 
content/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_append_mode.gif
 create mode 100644 
content/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_python_udf.png
 create mode 100644 
content/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_scala_udf.png
 create mode 100644 
content/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_single_mode.gif
 create mode 100644 
content/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_update_mode.gif
 create mode 100644 
img/blog/2020-06-23-flink-on-zeppelin-part2/flink_append_mode.gif
 create mode 100644 
img/blog/2020-06-23-flink-on-zeppelin-part2/flink_python_udf.png
 create mode 100644 
img/blog/2020-06-23-flink-on-zeppelin-part2/flink_scala_udf.png
 create mode 100644 
img/blog/2020-06-23-flink-on-zeppelin-part2/flink_single_mode.gif
 create mode 100644 
img/blog/2020-06-23-flink-on-zeppelin-part2/flink_update_mode.gif



[flink-web] 02/02: Rebuild website

2020-06-25 Thread liyu
This is an automated email from the ASF dual-hosted git repository.

liyu pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git

commit 355c2ab49228658ad80cf5f1bc3e97cbac7cc479
Author: Yu Li 
AuthorDate: Thu Jun 25 14:15:43 2020 +0800

Rebuild website
---
 content/blog/feed.xml  | 350 +++-
 content/blog/index.html|  39 ++-
 content/blog/page10/index.html |  40 ++-
 content/blog/page11/index.html |  40 ++-
 content/blog/page12/index.html |  25 ++
 content/blog/page2/index.html  |  36 +-
 content/blog/page3/index.html  |  36 +-
 content/blog/page4/index.html  |  36 +-
 content/blog/page5/index.html  |  38 ++-
 content/blog/page6/index.html  |  38 ++-
 content/blog/page7/index.html  |  38 ++-
 content/blog/page8/index.html  |  40 ++-
 content/blog/page9/index.html  |  40 ++-
 .../2020/06/23/flink-on-zeppelin-part2.html| 365 +
 .../flink_append_mode.gif  | Bin 0 -> 294307 bytes
 .../flink_python_udf.png   | Bin 0 -> 83093 bytes
 .../flink_scala_udf.png| Bin 0 -> 84516 bytes
 .../flink_single_mode.gif  | Bin 0 -> 58198 bytes
 .../flink_update_mode.gif  | Bin 0 -> 131055 bytes
 content/index.html |   9 +-
 content/zh/index.html  |   9 +-
 21 files changed, 779 insertions(+), 400 deletions(-)

diff --git a/content/blog/feed.xml b/content/blog/feed.xml
index eabf57c..e191286 100644
--- a/content/blog/feed.xml
+++ b/content/blog/feed.xml
@@ -7,6 +7,117 @@
 <atom:link href="https://flink.apache.org/blog/feed.xml" rel="self" type="application/rss+xml" />
 
 
+<item>
+<title>Flink on Zeppelin Notebooks for Interactive Data Analysis - Part 2</title>
+<description>&lt;p&gt;In a previous post, we introduced the basics of Flink on Zeppelin and how to do Streaming ETL. In this second part of the “Flink on Zeppelin” series of posts, I will share how to perform streaming data visualization via Flink on Zeppelin and how to use Apache Flink UDFs in Zeppelin.&lt;/p&gt;
+
+&lt;h1 id=&quot;streaming-data-visualization&quot;&gt;Streaming Data Visualization&lt;/h1&gt;
+
+&lt;p&gt;With &lt;a href=&quot;https://zeppelin.apache.org/&quot;&gt;Zeppelin&lt;/a&gt;, you can build a real-time streaming dashboard without writing a single line of JavaScript/HTML/CSS code.&lt;/p&gt;
+
+&lt;p&gt;Overall, Zeppelin supports 3 kinds of streaming data analytics:&lt;/p&gt;
+
+&lt;ul&gt;
+  &lt;li&gt;Single Mode&lt;/li&gt;
+  &lt;li&gt;Update Mode&lt;/li&gt;
+  &lt;li&gt;Append Mode&lt;/li&gt;
+&lt;/ul&gt;
+
+&lt;h3 id=&quot;single-mode&quot;&gt;Single Mode&lt;/h3&gt;
+&lt;p&gt;Single mode is used for cases where the result of a SQL statement is always a single row, as in the following example. The output is rendered as HTML, and you can specify the final output layout via the paragraph-local property &lt;code&gt;template&lt;/code&gt;, in which &lt;code&gt;{i}&lt;/code&gt; acts as a placeholder for the i-th column of the result.&lt;/p&gt;
+
+&lt;center&gt;
+&lt;img src=&quot;/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_single_mode.gif&quot; width=&quot;80%&quot; alt=&quot;Single Mode&quot; /&gt;
+&lt;/center&gt;
+
+&lt;h3 id=&quot;update-mode&quot;&gt;Update Mode&lt;/h3&gt;
+&lt;p&gt;Update mode is suitable for cases where the output is more than one row and is continuously updated. Here’s an example that uses &lt;code&gt;GROUP BY&lt;/code&gt;.&lt;/p&gt;
+
+&lt;center&gt;
+&lt;img src=&quot;/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_update_mode.gif&quot; width=&quot;80%&quot; alt=&quot;Update Mode&quot; /&gt;
+&lt;/center&gt;
+
+&lt;h3 id=&quot;append-mode&quot;&gt;Append Mode&lt;/h3&gt;
+&lt;p&gt;Append mode is suitable for cases where the output data is always appended. For instance, the example below uses a tumble window.&lt;/p&gt;
+
+&lt;center&gt;
+&lt;img src=&quot;/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_append_mode.gif&quot; width=&quot;80%&quot; alt=&quot;Append Mode&quot; /&gt;
+&lt;/center&gt;
+
+&lt;h1 id=&quot;udf&quot;&gt;UDF&lt;/h1&gt;
+
+&lt;p&gt;SQL is a very powerful language, especially for expressing data flow. But most of the time, you need to handle complicated business logic that cannot be expressed with SQL alone. In these cases, UDFs (user-defined functions) come in particularly handy. In Zeppelin, you can write Scala or Python UDFs, and you can also import existing Scala, Python and Java UDFs. Here are two examples, one Scala and one Python UDF:&lt;/p&gt;
+
+&lt;ul&gt;
+  &lt;li&gt;Scala UDF&lt;/li&gt;
+&lt;/ul&gt;
+
+&lt;div class=&quot;highlight&quot;&gt;&lt;pre&gt;&lt;code class=&quot;language-scala&quot;&gt;&lt;span class=&quot;o&quot;&gt;%&lt;/span&gt;&lt;span class=&quot;n&quot;&gt;flink&lt;/span&gt;
+
+&lt;span class=&quot;k&quot;&gt;class&lt;/span&gt; &lt;span class=&quot;nc&quot;&gt;ScalaUpper&lt;/span&gt; &lt;span class=&quot;k&quot;&gt;extends&lt;/span&gt; &lt;span class=&quot;nc&quot;&gt;ScalarFunction&lt;/span&gt; &lt;span class=&quot;o&quot;&gt;{&lt;/span&gt;
+  &lt;span class=&quot;k&quot;&gt;def&lt;/span&gt; &lt;span class=&quot;n&quot;&gt;eval&lt;/span&gt;&lt;span class=&quot;o&quot;&gt;(&lt;/span&gt;&lt;span class=&quot;n&quot;&gt;str&lt;/span&gt;&lt;span class=&quot;k&quot;&gt;:&lt;/span&gt; &lt;span class=&quot;kt&quot;&gt;String&lt;/span&gt;&lt;span class=&quot;o&quot;&gt;)&lt;/span&gt; &lt;span class=&quot;k&quot;&gt;=&lt;/span&gt; &lt;span class=&quot;n&quot;&gt;str&lt;/span&gt;&lt;span class=&quot;o&quot;&gt;.&lt;/span&gt;&lt;span clas [...]
+&lt;span class=&quot;o&quot;&gt;}&lt;/span&gt;
+&lt;span class=&quot;nb&quot;&gt;btenv&lt;/span&gt;&lt;span class=&quot;o&quot;&gt;.&lt;/span&gt;&lt;span class=&quot;n&quot;&gt;registerFunction&lt;/span&gt;&lt;span class=&quot;o&quot;&gt;(&lt;/span&gt;&lt;span class=&quot;s&quot;&gt;&amp;quot;scala_upper&amp;quot;&lt;/span&gt;&lt;span class=&quot;o&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;k&quot;&gt;new&lt;/span&gt; &lt;span

[flink-web] 01/02: [blog] flink on zeppelin - part2

2020-06-25 Thread liyu
This is an automated email from the ASF dual-hosted git repository.

liyu pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git

commit a5fea70a283d19e9c6573797ad2beab161da9f78
Author: Jeff Zhang 
AuthorDate: Tue Jun 2 12:50:20 2020 +0800

[blog] flink on zeppelin - part2

Co-authored-by: morsapaes 

This closes #344.
---
 _posts/2020-06-23-flink-on-zeppelin-part2.md   | 109 +
 .../flink_append_mode.gif  | Bin 0 -> 294307 bytes
 .../flink_python_udf.png   | Bin 0 -> 83093 bytes
 .../flink_scala_udf.png| Bin 0 -> 84516 bytes
 .../flink_single_mode.gif  | Bin 0 -> 58198 bytes
 .../flink_update_mode.gif  | Bin 0 -> 131055 bytes
 6 files changed, 109 insertions(+)

diff --git a/_posts/2020-06-23-flink-on-zeppelin-part2.md 
b/_posts/2020-06-23-flink-on-zeppelin-part2.md
new file mode 100644
index 000..782e74c
--- /dev/null
+++ b/_posts/2020-06-23-flink-on-zeppelin-part2.md
@@ -0,0 +1,109 @@
+---
+layout: post
+title:  "Flink on Zeppelin Notebooks for Interactive Data Analysis - Part 2"
+date:   2020-06-23T08:00:00.000Z
+categories: ecosystem
+authors:
+- zjffdu:
+  name: "Jeff Zhang"
+  twitter: "zjffdu"
+---
+
+In a previous post, we introduced the basics of Flink on Zeppelin and how to 
do Streaming ETL. In this second part of the "Flink on Zeppelin" series of 
posts, I will share how to 
+perform streaming data visualization via Flink on Zeppelin and how to use 
Apache Flink UDFs in Zeppelin. 
+
+# Streaming Data Visualization
+
+With [Zeppelin](https://zeppelin.apache.org/), you can build a real-time streaming dashboard without writing a single line of JavaScript/HTML/CSS code.
+
+Overall, Zeppelin supports 3 kinds of streaming data analytics:
+
+* Single Mode
+* Update Mode
+* Append Mode
+
+### Single Mode
+Single mode is used for cases where the result of a SQL statement is always a single row, as in the following example.
+The output is rendered as HTML, and you can specify the final output layout via the paragraph-local property `template`,
+in which `{i}` acts as a placeholder for the i-th column of the result.
+
+<center>
+<img src="/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_single_mode.gif" width="80%" alt="Single Mode" />
+</center>
+
+### Update Mode
+Update mode is suitable for cases where the output is more than one row and is continuously updated.
+Here's an example that uses `GROUP BY`.
+
+<center>
+<img src="/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_update_mode.gif" width="80%" alt="Update Mode" />
+</center>
+
+### Append Mode
+Append mode is suitable for cases where the output data is always appended.
+For instance, the example below uses a tumble window.
+
+<center>
+<img src="/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_append_mode.gif" width="80%" alt="Append Mode" />
+</center>
+
+# UDF
+
+SQL is a very powerful language, especially for expressing data flow. But most of the time, you need to handle complicated business logic that cannot be expressed with SQL alone.
+In these cases, UDFs (user-defined functions) come in particularly handy. In Zeppelin, you can write Scala or Python UDFs, and you can also import existing Scala, Python and Java UDFs.
+Here are two examples, one Scala and one Python UDF:
+
+* Scala UDF
+
+```scala
+%flink
+
+class ScalaUpper extends ScalarFunction {
+  def eval(str: String) = str.toUpperCase
+}
+btenv.registerFunction("scala_upper", new ScalaUpper())
+
+```
+ 
+* Python UDF
+
+```python
+
+%flink.pyflink
+
+class PythonUpper(ScalarFunction):
+    def eval(self, s):
+        return s.upper()
+
+bt_env.register_function("python_upper", udf(PythonUpper(), DataTypes.STRING(), DataTypes.STRING()))
+
+```
+
+After you define the UDFs, you can use them directly in SQL:
+
+* Use Scala UDF in SQL
+
+<center>
+<img src="/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_scala_udf.png" width="80%" alt="Use Scala UDF in SQL" />
+</center>
+
+* Use Python UDF in SQL
+
+<center>
+<img src="/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_python_udf.png" width="80%" alt="Use Python UDF in SQL" />
+</center>
+
+# Summary
+
+In this post, we explained how to perform streaming data visualization via Flink on Zeppelin and how to use UDFs.
+Beyond that, you can do much more with Flink in Zeppelin, such as batch processing and Hive integration.
+You can check the articles below for more details, and there is also a list of [Flink on Zeppelin tutorial videos](https://www.youtube.com/watch?v=YxPo0Fosjjg&list=PL4oy12nnS7FFtg3KV1iS5vDb0pTz12VcX) for your reference.
+
+# References
+
+* [Apache Zeppelin official website](http://zeppelin.apache.org)
+* Flink on Zeppelin tutorials - [Part 1](https://medium.com/@zjffdu/flink-on-zeppelin-part-1-get-started-2591aaa6aa47)
+* Flink on Zeppelin tutorials - [Part 2](https://medium.com/@zjffdu/flink-on-zeppelin-part-2-batch-711731df5ad9)
+* Flink on Zeppelin tutorials - [Part 3](https://medium.com/@zjffdu/flink-on-zeppelin-part-3-streaming-5fca1e16754)
+* Flink on Zeppelin tutorials - [Part 4](https://medium.com/@zjffdu/flink-on-zeppelin-part-4-advanced-usage-998b74908cd9)
+* [Flink on Zeppelin tutorial videos](https://www.youtube.com/watch?v=YxPo0Fosjjg&list=PL4oy12nnS7FFtg3KV1iS5vDb0pTz12VcX)
diff --git a/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_append_mode.gif 
b/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_append_mode.gif
new file mode 100644
index 000..3c827f4
Binary files /dev/null and