[GitHub] [flink] zhuzhurk merged pull request #13166: [FLINK-18081][doc] Fix broken links in Kerberos Authentication Setup …

2020-08-16 Thread GitBox


zhuzhurk merged pull request #13166:
URL: https://github.com/apache/flink/pull/13166


   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [flink] zhuzhurk commented on pull request #13166: [FLINK-18081][doc] Fix broken links in Kerberos Authentication Setup …

2020-08-16 Thread GitBox


zhuzhurk commented on pull request #13166:
URL: https://github.com/apache/flink/pull/13166#issuecomment-674671199


   > > The change looks good to me. @KarmaGYZ
   > > But I think you will need to rebuild the web and make the changed web 
   > > pages part of this PR as well.
   > 
   > I think we do not need to rebuild the documentation website @zhuzhurk. This 
   > work will be done here: https://ci.apache.org/builders/flink-docs-master.
   
   Sorry, I made a mistake. I mixed it up with flink-web.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [flink] flinkbot edited a comment on pull request #13141: [FLINK-18852] Fix StreamScan doesn't inherit parallelism from input in legacy planner

2020-08-16 Thread GitBox


flinkbot edited a comment on pull request #13141:
URL: https://github.com/apache/flink/pull/13141#issuecomment-673492124


   
   ## CI report:
   
   * 09f938c175da27a7e2a50418259cf006f4794db6 Azure: 
[FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5534)
 
   * 0e9c4198095239ebee07a466a5e23de1a60809ac Azure: 
[PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5581)
 
   
   
   Bot commands
 The @flinkbot bot supports the following commands:
   
- `@flinkbot run travis` re-run the last Travis build
- `@flinkbot run azure` re-run the last Azure build
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [flink] KarmaGYZ commented on pull request #13166: [FLINK-18081][doc] Fix broken links in Kerberos Authentication Setup …

2020-08-16 Thread GitBox


KarmaGYZ commented on pull request #13166:
URL: https://github.com/apache/flink/pull/13166#issuecomment-674662368


   > The change looks good to me. @KarmaGYZ
   > But I think you will need to rebuild the web and make the changed web 
   > pages part of this PR as well.
   
   I think we do not need to rebuild the documentation website. This work will 
   be done here: https://ci.apache.org/builders/flink-docs-master.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [flink] KarmaGYZ edited a comment on pull request #13166: [FLINK-18081][doc] Fix broken links in Kerberos Authentication Setup …

2020-08-16 Thread GitBox


KarmaGYZ edited a comment on pull request #13166:
URL: https://github.com/apache/flink/pull/13166#issuecomment-674662368


   > The change looks good to me. @KarmaGYZ
   > But I think you will need to rebuild the web and make the changed web 
   > pages part of this PR as well.
   
   I think we do not need to rebuild the documentation website @zhuzhurk. This 
   work will be done here: https://ci.apache.org/builders/flink-docs-master.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[jira] [Updated] (FLINK-17613) Run K8s related e2e tests with multiple K8s versions

2020-08-16 Thread Robert Metzger (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-17613?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Robert Metzger updated FLINK-17613:
---
Labels: starter  (was: )

> Run K8s related e2e tests with multiple K8s versions
> 
>
> Key: FLINK-17613
> URL: https://issues.apache.org/jira/browse/FLINK-17613
> Project: Flink
>  Issue Type: Improvement
>  Components: Deployment / Kubernetes, Tests
>Reporter: Yang Wang
>Priority: Major
>  Labels: starter
>
> Follow the discussion in this PR [1].
> If we could run the K8s related e2e tests with multiple versions (the latest 
> version, oldest maintained version, etc.), it would help us find the 
> usability issues earlier, before users post them in the user ML [2]. 
> [1].https://github.com/apache/flink/pull/12071#discussion_r422838483
> [2].https://lists.apache.org/list.html?u...@flink.apache.org:lte=1M:Cannot%20start%20native%20K8s



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [flink] hequn8128 commented on a change in pull request #13161: [FLINK-18886][python] Support Kafka connectors for Python DataStream …

2020-08-16 Thread GitBox


hequn8128 commented on a change in pull request #13161:
URL: https://github.com/apache/flink/pull/13161#discussion_r471206526



##
File path: flink-python/pyflink/datastream/tests/test_connectors.py
##
@@ -0,0 +1,97 @@
+
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#  http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+# limitations under the License.
+
+
+from pyflink.common import typeinfo
+from pyflink.common.serialization_schemas import JsonRowDeserializationSchema, \
+    JsonRowSerializationSchema
+from pyflink.common.typeinfo import Types
+from pyflink.datastream import StreamExecutionEnvironment
+from pyflink.datastream.connectors import FlinkKafkaConsumer010, FlinkKafkaProducer010, \
+    FlinkKafkaConsumer011, FlinkKafkaProducer011, FlinkKafkaConsumer, FlinkKafkaProducer
+from pyflink.java_gateway import get_gateway
+from pyflink.testing.test_case_utils import PyFlinkTestCase, _load_specific_flink_module_jars, \
+    get_private_field, invoke_java_object_method
+
+
+class FlinkKafkaTest(PyFlinkTestCase):
+
+    def setUp(self) -> None:
+        self.env = StreamExecutionEnvironment.get_execution_environment()
+        self._cxt_clz_loader = get_gateway().jvm.Thread.currentThread().getContextClassLoader()

Review comment:
   Add comments for the self._cxt_clz_loader during setUp and tearDown. 

##
File path: flink-python/pyflink/datastream/connectors.py
##
@@ -0,0 +1,546 @@
+
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#  http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+# limitations under the License.
+
+from typing import Dict, List, Union
+
+from pyflink.common.serialization_schemas import DeserializationSchema, SerializationSchema
+from pyflink.datastream.functions import SourceFunction, SinkFunction
+from pyflink.java_gateway import get_gateway
+
+
+class FlinkKafkaConsumerBase(SourceFunction):
+"""
+Base class of all Flink Kafka Consumer data sources. This implements the 
common behavior across
+all kafka versions.
+
+The Kafka version specific behavior is defined mainly in the specific 
subclasses.
+"""
+
+def __init__(self, j_flink_kafka_consumer):
+super(FlinkKafkaConsumerBase, 
self).__init__(source_func=j_flink_kafka_consumer)
+
+def set_commit_offsets_on_checkpoints(self, commit_on_checkpoints: bool):
+"""
+Specifies whether or not the consumer should commit offsets back to 
kafka on checkpoints.
+This setting will only have effect if checkpointing is enabled for the 
job. If checkpointing
+isn't enabled, only the "auto.commit.enable" (for 0.8) / 
"enable.auto.commit" (for 0.9+)
+property settings will be used.
+"""
+self._j_function = self._j_function \
+.setCommitOffsetsOnCheckpoints(commit_on_checkpoints)
+return self
+
+def set_start_from_earliest(self):
+"""
+Specifies the consumer to start reading from the earliest offset for 
all partitions. This
+lets the consumer ignore any committed group offsets in Zookeeper/ 
Kafka brokers.
+
+This method does not affect where partitions are read from when the 
consumer is restored
+from a checkpoint or savepoint. When the consumer is restored from a 
checkpoint or
+savepoint, only the 

[GitHub] [flink] flinkbot edited a comment on pull request #13167: [FLINK-18212][table-planner-blink] Fix Lookup Join failed when there is a UDF equal condition on the column of temporal table

2020-08-16 Thread GitBox


flinkbot edited a comment on pull request #13167:
URL: https://github.com/apache/flink/pull/13167#issuecomment-674654395


   
   ## CI report:
   
   * 0dcc8385ac8e114a23fefb452e6abd2cb094e65c Azure: 
[PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5580)
 
   
   
   Bot commands
 The @flinkbot bot supports the following commands:
   
- `@flinkbot run travis` re-run the last Travis build
- `@flinkbot run azure` re-run the last Azure build
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [flink] flinkbot edited a comment on pull request #13141: [FLINK-18852] Fix StreamScan doesn't inherit parallelism from input in legacy planner

2020-08-16 Thread GitBox


flinkbot edited a comment on pull request #13141:
URL: https://github.com/apache/flink/pull/13141#issuecomment-673492124


   
   ## CI report:
   
   * 09f938c175da27a7e2a50418259cf006f4794db6 Azure: 
[FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5534)
 
   * 0e9c4198095239ebee07a466a5e23de1a60809ac UNKNOWN
   
   
   Bot commands
 The @flinkbot bot supports the following commands:
   
- `@flinkbot run travis` re-run the last Travis build
- `@flinkbot run azure` re-run the last Azure build
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [flink] flinkbot commented on pull request #13167: [FLINK-18212][table-planner-blink] Fix Lookup Join failed when there is a UDF equal condition on the column of temporal table

2020-08-16 Thread GitBox


flinkbot commented on pull request #13167:
URL: https://github.com/apache/flink/pull/13167#issuecomment-674654395


   
   ## CI report:
   
   * 0dcc8385ac8e114a23fefb452e6abd2cb094e65c UNKNOWN
   
   
   Bot commands
 The @flinkbot bot supports the following commands:
   
- `@flinkbot run travis` re-run the last Travis build
- `@flinkbot run azure` re-run the last Azure build
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [flink] flinkbot commented on pull request #13167: [FLINK-18212][table-planner-blink] Fix Lookup Join failed when there is a UDF equal condition on the column of temporal table

2020-08-16 Thread GitBox


flinkbot commented on pull request #13167:
URL: https://github.com/apache/flink/pull/13167#issuecomment-674648249


   Thanks a lot for your contribution to the Apache Flink project. I'm the 
@flinkbot. I help the community
   to review your pull request. We will use this comment to track the progress 
of the review.
   
   
   ## Automated Checks
   Last check on commit 0dcc8385ac8e114a23fefb452e6abd2cb094e65c (Mon Aug 17 
04:26:45 UTC 2020)
   
   **Warnings:**
* No documentation files were touched! Remember to keep the Flink docs up 
to date!
   
   
   Mention the bot in a comment to re-run the automated checks.
   ## Review Progress
   
   * ❓ 1. The [description] looks good.
   * ❓ 2. There is [consensus] that the contribution should go into Flink.
   * ❓ 3. Needs [attention] from.
   * ❓ 4. The change fits into the overall [architecture].
   * ❓ 5. Overall code [quality] is good.
   
   Please see the [Pull Request Review 
Guide](https://flink.apache.org/contributing/reviewing-prs.html) for a full 
explanation of the review process.
The Bot is tracking the review progress through labels. Labels are applied 
according to the order of the review items. For consensus, approval by a Flink 
committer or PMC member is required.
   Bot commands
 The @flinkbot bot supports the following commands:
   
- `@flinkbot approve description` to approve one or more aspects (aspects: 
`description`, `consensus`, `architecture` and `quality`)
- `@flinkbot approve all` to approve all aspects
- `@flinkbot approve-until architecture` to approve everything until 
`architecture`
- `@flinkbot attention @username1 [@username2 ..]` to require somebody's 
attention
- `@flinkbot disapprove architecture` to remove an approval you gave earlier
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [flink] flinkbot edited a comment on pull request #11873: [FLINK-17295] Refactor the ExecutionAttemptID to consist of Execution…

2020-08-16 Thread GitBox


flinkbot edited a comment on pull request #11873:
URL: https://github.com/apache/flink/pull/11873#issuecomment-618208681


   
   ## CI report:
   
   * 04ebeee8b4a19b2aafee9e57a746522cd2c9654f Azure: 
[SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4782)
 Azure: 
[SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4700)
 
   * ae7da66548039da54cf4ac664c517e2bb8eab68f Azure: 
[PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5579)
 
   
   
   Bot commands
 The @flinkbot bot supports the following commands:
   
- `@flinkbot run travis` re-run the last Travis build
- `@flinkbot run azure` re-run the last Azure build
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [flink] wuchong commented on pull request #13167: [FLINK-18212][table-planner-blink] Fix Lookup Join failed when there is a UDF equal condition on the column of temporal table

2020-08-16 Thread GitBox


wuchong commented on pull request #13167:
URL: https://github.com/apache/flink/pull/13167#issuecomment-674647857


   cc @leonardBang , could you help to review this ?



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[jira] [Updated] (FLINK-18212) Init lookup join failed when use udf on lookup table

2020-08-16 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18212?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

ASF GitHub Bot updated FLINK-18212:
---
Labels: pull-request-available  (was: )

> Init lookup join failed when use udf on lookup table
> 
>
> Key: FLINK-18212
> URL: https://issues.apache.org/jira/browse/FLINK-18212
> Project: Flink
>  Issue Type: Bug
>  Components: Table SQL / Planner
>Affects Versions: 1.10.1
>Reporter: YufeiLiu
>Assignee: Jark Wu
>Priority: Major
>  Labels: pull-request-available
> Fix For: 1.12.0, 1.11.2, 1.10.3
>
>
> Throw exception 
> {code}
> Caused by: scala.MatchError: (CONCAT(_UTF-16LE'Hello', 
> $2),_UTF-16LE'Hello,Jark':VARCHAR(2147483647) CHARACTER SET "UTF-16LE") (of 
> class scala.Tuple2)
>   at 
> org.apache.flink.table.planner.plan.nodes.common.CommonLookupJoin.org$apache$flink$table$planner$plan$nodes$common$CommonLookupJoin$$extractConstantField(CommonLookupJoin.scala:617)
>   at 
> org.apache.flink.table.planner.plan.nodes.common.CommonLookupJoin.extractConstantFieldsFromEquiCondition(CommonLookupJoin.scala:607)
>   at 
> org.apache.flink.table.planner.plan.nodes.common.CommonLookupJoin.analyzeLookupKeys(CommonLookupJoin.scala:567)
>   at 
> org.apache.flink.table.planner.plan.nodes.common.CommonLookupJoin.<init>(CommonLookupJoin.scala:129)
>   at 
> org.apache.flink.table.planner.plan.nodes.physical.stream.StreamExecLookupJoin.<init>(StreamExecLookupJoin.scala:49)
> {code}
> SQL:
> {code:sql}
> SELECT
>   T.id, T.len, T.content, D.name 
> FROM 
>   T JOIN userTable for system_time as of T.proctime AS D 
> ON T.id = D.id 
> WHERE 
>   add(T.id, D.id) > 3 AND add(T.id, 2) > 3 AND CONCAT('Hello', D.name) = 
> 'Hello,Jark'
> {code}
> When a function is used, a RexCall can't be matched as a RexInputRef, which 
> causes this error; maybe we should add a "{{case _ => return}}" condition to skip it.
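
For illustration, a minimal, self-contained Scala sketch of the wildcard arm suggested 
above. This is not the actual CommonLookupJoin code: the Rex case classes and the 
extractConstantFields helper are simplified stand-ins for Calcite's 
RexInputRef/RexCall/RexLiteral and the planner's extraction logic, used only to show how 
a non-matching operand (such as the CONCAT call) can be skipped instead of raising 
scala.MatchError.

{code:scala}
// Minimal sketch only: plain case classes stand in for Calcite's RexInputRef,
// RexCall and RexLiteral; the real logic lives in CommonLookupJoin.scala.
sealed trait Rex
case class RexInputRef(index: Int) extends Rex                  // column reference
case class RexCall(op: String, operands: Seq[Rex]) extends Rex  // e.g. CONCAT('Hello', D.name)
case class RexLiteral(value: String) extends Rex                // constant

object ConstantFieldExtraction {

  // Collect "column = constant" pairs from the equi-conditions. The wildcard arm
  // ignores anything else (such as a UDF call on the lookup-table column)
  // instead of throwing scala.MatchError.
  def extractConstantFields(equiConditions: Seq[(Rex, Rex)]): Map[Int, String] =
    equiConditions.flatMap {
      case (RexInputRef(idx), RexLiteral(v)) => Some(idx -> v)
      case _                                 => None  // the suggested "case _" guard
    }.toMap

  def main(args: Array[String]): Unit = {
    val conditions = Seq(
      (RexInputRef(2), RexLiteral("Hello,Jark")),                   // kept
      (RexCall("CONCAT", Seq(RexLiteral("Hello"), RexInputRef(3))), // skipped, no error
       RexLiteral("Hello,Jark")))
    println(extractConstantFields(conditions))  // Map(2 -> Hello,Jark)
  }
}
{code}

With such a fallback the CONCAT pair is simply ignored during constant-field extraction 
instead of failing the whole plan translation with scala.MatchError.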



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [flink] wuchong opened a new pull request #13167: [FLINK-18212][table-planner-blink] Fix Lookup Join failed when there is a UDF equal condition on the column of temporal table

2020-08-16 Thread GitBox


wuchong opened a new pull request #13167:
URL: https://github.com/apache/flink/pull/13167


   
   
   
   
   ## What is the purpose of the change
   
   The following query fails because there is a UDF equality condition on a 
   column of the temporal table, and we don't handle this case when analysing 
   constant equality conditions. 
   
   ```
   SELECT
 T.id, T.len, T.content, D.name 
   FROM 
 T JOIN userTable for system_time as of T.proctime AS D 
   ON T.id = D.id 
   WHERE 
 add(T.id, D.id) > 3 AND add(T.id, 2) > 3 AND CONCAT('Hello', D.name) = 
'Hello,Jark'
   ```
   
   
   ## Brief change log
   
   - Fix this problem in `CommonLookupJoin`. 
   
   
   ## Verifying this change
   
   - Added an integration test and a plan test which can reproduce this problem.
   
   ## Does this pull request potentially affect one of the following parts:
   
 - Dependencies (does it add or upgrade a dependency): (yes / **no**)
 - The public API, i.e., is any changed class annotated with 
`@Public(Evolving)`: (yes / **no**)
 - The serializers: (yes / **no** / don't know)
 - The runtime per-record code paths (performance sensitive): (yes / **no** 
/ don't know)
 - Anything that affects deployment or recovery: JobManager (and its 
components), Checkpointing, Yarn/Mesos, ZooKeeper: (yes / **no** / don't know)
 - The S3 file system connector: (yes / **no** / don't know)
   
   ## Documentation
   
 - Does this pull request introduce a new feature? (yes / **no**)
 - If yes, how is the feature documented? (**not applicable** / docs / 
JavaDocs / not documented)
   
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [flink] flinkbot edited a comment on pull request #12770: [FLINK-18200][python] Replace the deprecated interfaces with the new interfaces in the tests and examples

2020-08-16 Thread GitBox


flinkbot edited a comment on pull request #12770:
URL: https://github.com/apache/flink/pull/12770#issuecomment-649764859


   
   ## CI report:
   
   * e8821f7b8ba36c7b393039368796f484427d24d8 Azure: 
[FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4058)
 
   * 240e0ba90bb9c41c794cf1ae2e15115a59ad5164 Azure: 
[PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5578)
 
   
   
   Bot commands
 The @flinkbot bot supports the following commands:
   
- `@flinkbot run travis` re-run the last Travis build
- `@flinkbot run azure` re-run the last Azure build
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[jira] [Commented] (FLINK-18212) Init lookup join failed when use udf on lookup table

2020-08-16 Thread Zhu Zhu (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-18212?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17178707#comment-17178707
 ] 

Zhu Zhu commented on FLINK-18212:
-

Thanks for the updates [~jark]

> Init lookup join failed when use udf on lookup table
> 
>
> Key: FLINK-18212
> URL: https://issues.apache.org/jira/browse/FLINK-18212
> Project: Flink
>  Issue Type: Bug
>  Components: Table SQL / Planner
>Affects Versions: 1.10.1
>Reporter: YufeiLiu
>Assignee: Jark Wu
>Priority: Major
>
> Throw exception 
> {code}
> Caused by: scala.MatchError: (CONCAT(_UTF-16LE'Hello', 
> $2),_UTF-16LE'Hello,Jark':VARCHAR(2147483647) CHARACTER SET "UTF-16LE") (of 
> class scala.Tuple2)
>   at 
> org.apache.flink.table.planner.plan.nodes.common.CommonLookupJoin.org$apache$flink$table$planner$plan$nodes$common$CommonLookupJoin$$extractConstantField(CommonLookupJoin.scala:617)
>   at 
> org.apache.flink.table.planner.plan.nodes.common.CommonLookupJoin.extractConstantFieldsFromEquiCondition(CommonLookupJoin.scala:607)
>   at 
> org.apache.flink.table.planner.plan.nodes.common.CommonLookupJoin.analyzeLookupKeys(CommonLookupJoin.scala:567)
>   at 
> org.apache.flink.table.planner.plan.nodes.common.CommonLookupJoin.<init>(CommonLookupJoin.scala:129)
>   at 
> org.apache.flink.table.planner.plan.nodes.physical.stream.StreamExecLookupJoin.<init>(StreamExecLookupJoin.scala:49)
> {code}
> SQL:
> {code:sql}
> SELECT
>   T.id, T.len, T.content, D.name 
> FROM 
>   T JOIN userTable for system_time as of T.proctime AS D 
> ON T.id = D.id 
> WHERE 
>   add(T.id, D.id) > 3 AND add(T.id, 2) > 3 AND CONCAT('Hello', D.name) = 
> 'Hello,Jark'
> {code}
> When a function is used, a RexCall can't be matched as a RexInputRef, which 
> causes this error; maybe we should add a "{{case _ => return}}" condition to skip it.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (FLINK-18212) Init lookup join failed when use udf on lookup table

2020-08-16 Thread Zhu Zhu (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18212?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Zhu Zhu updated FLINK-18212:

Fix Version/s: 1.10.3
   1.11.2
   1.12.0

> Init lookup join failed when use udf on lookup table
> 
>
> Key: FLINK-18212
> URL: https://issues.apache.org/jira/browse/FLINK-18212
> Project: Flink
>  Issue Type: Bug
>  Components: Table SQL / Planner
>Affects Versions: 1.10.1
>Reporter: YufeiLiu
>Assignee: Jark Wu
>Priority: Major
> Fix For: 1.12.0, 1.11.2, 1.10.3
>
>
> Throw exception 
> {code}
> Caused by: scala.MatchError: (CONCAT(_UTF-16LE'Hello', 
> $2),_UTF-16LE'Hello,Jark':VARCHAR(2147483647) CHARACTER SET "UTF-16LE") (of 
> class scala.Tuple2)
>   at 
> org.apache.flink.table.planner.plan.nodes.common.CommonLookupJoin.org$apache$flink$table$planner$plan$nodes$common$CommonLookupJoin$$extractConstantField(CommonLookupJoin.scala:617)
>   at 
> org.apache.flink.table.planner.plan.nodes.common.CommonLookupJoin.extractConstantFieldsFromEquiCondition(CommonLookupJoin.scala:607)
>   at 
> org.apache.flink.table.planner.plan.nodes.common.CommonLookupJoin.analyzeLookupKeys(CommonLookupJoin.scala:567)
>   at 
> org.apache.flink.table.planner.plan.nodes.common.CommonLookupJoin.<init>(CommonLookupJoin.scala:129)
>   at 
> org.apache.flink.table.planner.plan.nodes.physical.stream.StreamExecLookupJoin.<init>(StreamExecLookupJoin.scala:49)
> {code}
> SQL:
> {code:sql}
> SELECT
>   T.id, T.len, T.content, D.name 
> FROM 
>   T JOIN userTable for system_time as of T.proctime AS D 
> ON T.id = D.id 
> WHERE 
>   add(T.id, D.id) > 3 AND add(T.id, 2) > 3 AND CONCAT('Hello', D.name) = 
> 'Hello,Jark'
> {code}
> When a function is used, a RexCall can't be matched as a RexInputRef, which 
> causes this error; maybe we should add a "{{case _ => return}}" condition to skip it.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [flink] flinkbot edited a comment on pull request #11873: [FLINK-17295] Refactor the ExecutionAttemptID to consist of Execution…

2020-08-16 Thread GitBox


flinkbot edited a comment on pull request #11873:
URL: https://github.com/apache/flink/pull/11873#issuecomment-618208681


   
   ## CI report:
   
   * 04ebeee8b4a19b2aafee9e57a746522cd2c9654f Azure: 
[SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4782)
 Azure: 
[SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4700)
 
   * ae7da66548039da54cf4ac664c517e2bb8eab68f UNKNOWN
   
   
   Bot commands
 The @flinkbot bot supports the following commands:
   
- `@flinkbot run travis` re-run the last Travis build
- `@flinkbot run azure` re-run the last Azure build
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [flink] zhuzhurk commented on a change in pull request #11873: [FLINK-17295] Refactor the ExecutionAttemptID to consist of Execution…

2020-08-16 Thread GitBox


zhuzhurk commented on a change in pull request #11873:
URL: https://github.com/apache/flink/pull/11873#discussion_r471221474



##
File path: 
flink-runtime/src/main/java/org/apache/flink/runtime/executiongraph/ExecutionAttemptID.java
##
@@ -18,33 +18,77 @@
 
 package org.apache.flink.runtime.executiongraph;
 
-import org.apache.flink.util.AbstractID;
+import org.apache.flink.annotation.VisibleForTesting;
+import org.apache.flink.runtime.scheduler.strategy.ExecutionVertexID;
 
 import org.apache.flink.shaded.netty4.io.netty.buffer.ByteBuf;
+import org.apache.flink.util.Preconditions;
 
 /**
 * Unique identifier for the attempt to execute a tasks. Multiple attempts happen
  * in cases of failures and recovery.
  */
-public class ExecutionAttemptID extends AbstractID {
+public class ExecutionAttemptID implements java.io.Serializable {
 
private static final long serialVersionUID = -1169683445778281344L;
 
+   private final ExecutionVertexID executionVertexID;
+   private final int attemptNumber;
+
+   /**
+* Get a random execution attempt id.
+*/
public ExecutionAttemptID() {

Review comment:
   Makes sense.





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[jira] [Commented] (FLINK-18212) Init lookup join failed when use udf on lookup table

2020-08-16 Thread Jark Wu (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-18212?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17178705#comment-17178705
 ] 

Jark Wu commented on FLINK-18212:
-

This is a bug. I will fix this. 

> Init lookup join failed when use udf on lookup table
> 
>
> Key: FLINK-18212
> URL: https://issues.apache.org/jira/browse/FLINK-18212
> Project: Flink
>  Issue Type: Bug
>  Components: Table SQL / Planner
>Affects Versions: 1.10.1
>Reporter: YufeiLiu
>Priority: Major
>
> Throw exception 
> {code}
> Caused by: scala.MatchError: (CONCAT(_UTF-16LE'Hello', 
> $2),_UTF-16LE'Hello,Jark':VARCHAR(2147483647) CHARACTER SET "UTF-16LE") (of 
> class scala.Tuple2)
>   at 
> org.apache.flink.table.planner.plan.nodes.common.CommonLookupJoin.org$apache$flink$table$planner$plan$nodes$common$CommonLookupJoin$$extractConstantField(CommonLookupJoin.scala:617)
>   at 
> org.apache.flink.table.planner.plan.nodes.common.CommonLookupJoin.extractConstantFieldsFromEquiCondition(CommonLookupJoin.scala:607)
>   at 
> org.apache.flink.table.planner.plan.nodes.common.CommonLookupJoin.analyzeLookupKeys(CommonLookupJoin.scala:567)
>   at 
> org.apache.flink.table.planner.plan.nodes.common.CommonLookupJoin.<init>(CommonLookupJoin.scala:129)
>   at 
> org.apache.flink.table.planner.plan.nodes.physical.stream.StreamExecLookupJoin.<init>(StreamExecLookupJoin.scala:49)
> {code}
> SQL:
> {code:sql}
> SELECT
>   T.id, T.len, T.content, D.name 
> FROM 
>   T JOIN userTable for system_time as of T.proctime AS D 
> ON T.id = D.id 
> WHERE 
>   add(T.id, D.id) > 3 AND add(T.id, 2) > 3 AND CONCAT('Hello', D.name) = 
> 'Hello,Jark'
> {code}
> When a function is used, a RexCall can't be matched as a RexInputRef, which 
> causes this error; maybe we should add a "{{case _ => return}}" condition to skip it.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (FLINK-18212) Init lookup join failed when use udf on lookup table

2020-08-16 Thread Jark Wu (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18212?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jark Wu reassigned FLINK-18212:
---

Assignee: Jark Wu

> Init lookup join failed when use udf on lookup table
> 
>
> Key: FLINK-18212
> URL: https://issues.apache.org/jira/browse/FLINK-18212
> Project: Flink
>  Issue Type: Bug
>  Components: Table SQL / Planner
>Affects Versions: 1.10.1
>Reporter: YufeiLiu
>Assignee: Jark Wu
>Priority: Major
>
> Throw exception 
> {code}
> Caused by: scala.MatchError: (CONCAT(_UTF-16LE'Hello', 
> $2),_UTF-16LE'Hello,Jark':VARCHAR(2147483647) CHARACTER SET "UTF-16LE") (of 
> class scala.Tuple2)
>   at 
> org.apache.flink.table.planner.plan.nodes.common.CommonLookupJoin.org$apache$flink$table$planner$plan$nodes$common$CommonLookupJoin$$extractConstantField(CommonLookupJoin.scala:617)
>   at 
> org.apache.flink.table.planner.plan.nodes.common.CommonLookupJoin.extractConstantFieldsFromEquiCondition(CommonLookupJoin.scala:607)
>   at 
> org.apache.flink.table.planner.plan.nodes.common.CommonLookupJoin.analyzeLookupKeys(CommonLookupJoin.scala:567)
>   at 
> org.apache.flink.table.planner.plan.nodes.common.CommonLookupJoin.<init>(CommonLookupJoin.scala:129)
>   at 
> org.apache.flink.table.planner.plan.nodes.physical.stream.StreamExecLookupJoin.<init>(StreamExecLookupJoin.scala:49)
> {code}
> SQL:
> {code:sql}
> SELECT
>   T.id, T.len, T.content, D.name 
> FROM 
>   T JOIN userTable for system_time as of T.proctime AS D 
> ON T.id = D.id 
> WHERE 
>   add(T.id, D.id) > 3 AND add(T.id, 2) > 3 AND CONCAT('Hello', D.name) = 
> 'Hello,Jark'
> {code}
> When a function is used, a RexCall can't be matched as a RexInputRef, which 
> causes this error; maybe we should add a "{{case _ => return}}" condition to skip it.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [flink] flinkbot edited a comment on pull request #13166: [FLINK-18081][doc] Fix broken links in Kerberos Authentication Setup …

2020-08-16 Thread GitBox


flinkbot edited a comment on pull request #13166:
URL: https://github.com/apache/flink/pull/13166#issuecomment-674638589


   
   ## CI report:
   
   * 0a800fb84dd64e4d8619a5541328ecc975e1614b Azure: 
[SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5577)
 
   
   
   Bot commands
 The @flinkbot bot supports the following commands:
   
- `@flinkbot run travis` re-run the last Travis build
- `@flinkbot run azure` re-run the last Azure build
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[jira] [Commented] (FLINK-18899) The property yarn.application-attempts default value is not none

2020-08-16 Thread Yang Wang (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-18899?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17178702#comment-17178702
 ] 

Yang Wang commented on FLINK-18899:
---

Hi [~wooplevip], thanks for creating this ticket. IIUC, the reason why we leave 
the default value of {{yarn.application-attempts}} as none is to configure it 
based on the HA configuration. If the users do not specify an explicit value 
for {{yarn.application-attempts}}, then
 * For non-HA mode, we should set the attempts to 1 to make the application 
fail fast, because when a new JobManager is launched, it cannot recover from 
the latest checkpoint.
 * For HA mode, we should respect the configuration of Yarn and relaunch the 
JobManager more than once.

 

We could update the description to show the default behavior. Does it make 
sense to you?

> The property yarn.application-attempts default value is not none
> 
>
> Key: FLINK-18899
> URL: https://issues.apache.org/jira/browse/FLINK-18899
> Project: Flink
>  Issue Type: Bug
>  Components: Deployment / YARN, Documentation
>Affects Versions: 1.11.0
>Reporter: Peng
>Priority: Major
>
> The document 
> [https://ci.apache.org/projects/flink/flink-docs-release-1.11/ops/config.html#yarn-application-attempts]
>  shows the yarn.application-attempts default value is none, but I found code 
> in the org.apache.flink.yarn.YarnClusterDescriptor class like below:
> {code:java}
> if (HighAvailabilityMode.isHighAvailabilityModeActivated(configuration)) {
>// activate re-execution of failed applications
>appContext.setMaxAppAttempts(
>  configuration.getInteger(
>YarnConfigOptions.APPLICATION_ATTEMPTS.key(),
>YarnConfiguration.DEFAULT_RM_AM_MAX_ATTEMPTS));
>activateHighAvailabilitySupport(appContext);
> } else {
>// set number of application retries to 1 in the default case
>appContext.setMaxAppAttempts(
>  configuration.getInteger(
>YarnConfigOptions.APPLICATION_ATTEMPTS.key(),
>1));
> }
> {code}
> This means that in HA mode the default value comes from Yarn (default 2); 
> otherwise the default value is 1.
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [flink] flinkbot edited a comment on pull request #12770: [FLINK-18200][python] Replace the deprecated interfaces with the new interfaces in the tests and examples

2020-08-16 Thread GitBox


flinkbot edited a comment on pull request #12770:
URL: https://github.com/apache/flink/pull/12770#issuecomment-649764859


   
   ## CI report:
   
   * e8821f7b8ba36c7b393039368796f484427d24d8 Azure: 
[FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4058)
 
   * 240e0ba90bb9c41c794cf1ae2e15115a59ad5164 UNKNOWN
   
   
   Bot commands
 The @flinkbot bot supports the following commands:
   
- `@flinkbot run travis` re-run the last Travis build
- `@flinkbot run azure` re-run the last Azure build
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [flink] KarmaGYZ commented on a change in pull request #11873: [FLINK-17295] Refactor the ExecutionAttemptID to consist of Execution…

2020-08-16 Thread GitBox


KarmaGYZ commented on a change in pull request #11873:
URL: https://github.com/apache/flink/pull/11873#discussion_r471218269



##
File path: 
flink-runtime/src/main/java/org/apache/flink/runtime/executiongraph/ExecutionAttemptID.java
##
@@ -18,33 +18,77 @@
 
 package org.apache.flink.runtime.executiongraph;
 
-import org.apache.flink.util.AbstractID;
+import org.apache.flink.annotation.VisibleForTesting;
+import org.apache.flink.runtime.scheduler.strategy.ExecutionVertexID;
 
 import org.apache.flink.shaded.netty4.io.netty.buffer.ByteBuf;
+import org.apache.flink.util.Preconditions;
 
 /**
 * Unique identifier for the attempt to execute a tasks. Multiple attempts happen
  * in cases of failures and recovery.
  */
-public class ExecutionAttemptID extends AbstractID {
+public class ExecutionAttemptID implements java.io.Serializable {
 
private static final long serialVersionUID = -1169683445778281344L;
 
+   private final ExecutionVertexID executionVertexID;
+   private final int attemptNumber;
+
+   /**
+* Get a random execution attempt id.
+*/
public ExecutionAttemptID() {

Review comment:
   Yes, that's a good point. However, it is currently used by production 
code and a lot of testing code paths. I would move this issue out of the scope 
of this PR. WDYT?





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [flink] flinkbot commented on pull request #13166: [FLINK-18081][doc] Fix broken links in Kerberos Authentication Setup …

2020-08-16 Thread GitBox


flinkbot commented on pull request #13166:
URL: https://github.com/apache/flink/pull/13166#issuecomment-674638589


   
   ## CI report:
   
   * 0a800fb84dd64e4d8619a5541328ecc975e1614b UNKNOWN
   
   
   Bot commands
 The @flinkbot bot supports the following commands:
   
- `@flinkbot run travis` re-run the last Travis build
- `@flinkbot run azure` re-run the last Azure build
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [flink] zhuzhurk commented on pull request #13166: [FLINK-18081][doc] Fix broken links in Kerberos Authentication Setup …

2020-08-16 Thread GitBox


zhuzhurk commented on pull request #13166:
URL: https://github.com/apache/flink/pull/13166#issuecomment-674638099


   The change looks good to me. @KarmaGYZ 
   But I think you will need to rebuild the web and make the changed web pages 
   part of this PR as well.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [flink] KarmaGYZ commented on a change in pull request #11873: [FLINK-17295] Refactor the ExecutionAttemptID to consist of Execution…

2020-08-16 Thread GitBox


KarmaGYZ commented on a change in pull request #11873:
URL: https://github.com/apache/flink/pull/11873#discussion_r471210685



##
File path: 
flink-runtime/src/main/java/org/apache/flink/runtime/executiongraph/ExecutionAttemptID.java
##
@@ -18,33 +18,77 @@
 
 package org.apache.flink.runtime.executiongraph;
 
-import org.apache.flink.util.AbstractID;
+import org.apache.flink.annotation.VisibleForTesting;
+import org.apache.flink.runtime.scheduler.strategy.ExecutionVertexID;
 
 import org.apache.flink.shaded.netty4.io.netty.buffer.ByteBuf;
+import org.apache.flink.util.Preconditions;
 
 /**
 * Unique identifier for the attempt to execute a tasks. Multiple attempts happen
  * in cases of failures and recovery.
  */
-public class ExecutionAttemptID extends AbstractID {
+public class ExecutionAttemptID implements java.io.Serializable {
 
private static final long serialVersionUID = -1169683445778281344L;
 
+   private final ExecutionVertexID executionVertexID;
+   private final int attemptNumber;
+
+   /**
+* Get a random execution attempt id.
+*/
public ExecutionAttemptID() {
+   this(new ExecutionVertexID(), 0);
}
 
-   public ExecutionAttemptID(long lowerPart, long upperPart) {
-   super(lowerPart, upperPart);
+   public ExecutionAttemptID(ExecutionVertexID executionVertexID, int attemptNumber) {
+   Preconditions.checkState(attemptNumber >= 0);
+   this.executionVertexID = Preconditions.checkNotNull(executionVertexID);
+   this.attemptNumber = attemptNumber;
}
 
public void writeTo(ByteBuf buf) {
-   buf.writeLong(this.lowerPart);
-   buf.writeLong(this.upperPart);
+   executionVertexID.writeTo(buf);
+   buf.writeInt(this.attemptNumber);
}
 
public static ExecutionAttemptID fromByteBuf(ByteBuf buf) {
-   long lower = buf.readLong();
-   long upper = buf.readLong();
-   return new ExecutionAttemptID(lower, upper);
+   final ExecutionVertexID executionVertexID = ExecutionVertexID.fromByteBuf(buf);
+   final int attemptNumber = buf.readInt();
+   return new ExecutionAttemptID(executionVertexID, attemptNumber);
+   }
+
+   @VisibleForTesting
+   public int getAttemptNumber() {
+   return attemptNumber;
+   }
+
+   @VisibleForTesting
+   public ExecutionVertexID getExecutionVertexID() {
+   return executionVertexID;
+   }
+
+   @Override
+   public boolean equals(Object obj) {
+   if (obj == this) {
+   return true;
+   } else if (obj != null && obj.getClass() == getClass()) {
+   ExecutionAttemptID that = (ExecutionAttemptID) obj;
+   return that.executionVertexID.equals(this.executionVertexID)
+   && that.attemptNumber == this.attemptNumber;
+   } else {
+   return false;
+   }
+   }
+
+   @Override
+   public int hashCode() {
+   return this.executionVertexID.hashCode() ^ this.attemptNumber;

Review comment:
   Good point!





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [flink] flinkbot commented on pull request #13166: [FLINK-18081][doc] Fix broken links in Kerberos Authentication Setup …

2020-08-16 Thread GitBox


flinkbot commented on pull request #13166:
URL: https://github.com/apache/flink/pull/13166#issuecomment-674632668


   Thanks a lot for your contribution to the Apache Flink project. I'm the 
@flinkbot. I help the community
   to review your pull request. We will use this comment to track the progress 
of the review.
   
   
   ## Automated Checks
   Last check on commit 0a800fb84dd64e4d8619a5541328ecc975e1614b (Mon Aug 17 
03:11:56 UTC 2020)
   
✅no warnings
   
   Mention the bot in a comment to re-run the automated checks.
   ## Review Progress
   
   * ❓ 1. The [description] looks good.
   * ❓ 2. There is [consensus] that the contribution should go into Flink.
   * ❓ 3. Needs [attention] from.
   * ❓ 4. The change fits into the overall [architecture].
   * ❓ 5. Overall code [quality] is good.
   
   Please see the [Pull Request Review 
Guide](https://flink.apache.org/contributing/reviewing-prs.html) for a full 
explanation of the review process.
The Bot is tracking the review progress through labels. Labels are applied 
according to the order of the review items. For consensus, approval by a Flink 
committer or PMC member is required.
   Bot commands
 The @flinkbot bot supports the following commands:
   
- `@flinkbot approve description` to approve one or more aspects (aspects: 
`description`, `consensus`, `architecture` and `quality`)
- `@flinkbot approve all` to approve all aspects
- `@flinkbot approve-until architecture` to approve everything until 
`architecture`
- `@flinkbot attention @username1 [@username2 ..]` to require somebody's 
attention
- `@flinkbot disapprove architecture` to remove an approval you gave earlier
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[jira] [Updated] (FLINK-18081) Fix broken links in "Kerberos Authentication Setup and Configuration" doc

2020-08-16 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18081?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

ASF GitHub Bot updated FLINK-18081:
---
Labels: pull-request-available  (was: )

> Fix broken links in "Kerberos Authentication Setup and Configuration" doc
> -
>
> Key: FLINK-18081
> URL: https://issues.apache.org/jira/browse/FLINK-18081
> Project: Flink
>  Issue Type: Bug
>  Components: Documentation, Runtime / Configuration
>Affects Versions: 1.10.1, 1.11.0, 1.12.0
>Reporter: Yangze Guo
>Assignee: Yangze Guo
>Priority: Major
>  Labels: pull-request-available
> Fix For: 1.12.0, 1.11.2, 1.10.3
>
>
> The {{config.html#kerberos-based-security}} is not valid now.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [flink] KarmaGYZ opened a new pull request #13166: [FLINK-18081][doc] Fix broken links in Kerberos Authentication Setup …

2020-08-16 Thread GitBox


KarmaGYZ opened a new pull request #13166:
URL: https://github.com/apache/flink/pull/13166


   …and Configuration
   
   
   
   ## What is the purpose of the change
   
   Fix broken links in "Kerberos Authentication Setup and Configuration".
   
   
   
   ## Verifying this change
   
   This change is a trivial rework / code cleanup without any test coverage.
   
   cc @zhuzhurk 
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [flink] liupc commented on pull request #13141: [FLINK-18852] Fix StreamScan doesn't inherit parallelism from input in legacy planner

2020-08-16 Thread GitBox


liupc commented on pull request #13141:
URL: https://github.com/apache/flink/pull/13141#issuecomment-674631652


   @wuchong Sorry, I checked it again; it seems to be related to the 
   parallelism changes. I will fix it!



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[jira] [Commented] (FLINK-17613) Run K8s related e2e tests with multiple K8s versions

2020-08-16 Thread Yang Wang (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-17613?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17178692#comment-17178692
 ] 

Yang Wang commented on FLINK-17613:
---

Hi [~rmetzger], I think this ticket could be marked with "starter" for some new 
contributors, and I am glad to help with the review. Moreover, if no contributors 
are going to take this task, I could take over and finish it.

> Run K8s related e2e tests with multiple K8s versions
> 
>
> Key: FLINK-17613
> URL: https://issues.apache.org/jira/browse/FLINK-17613
> Project: Flink
>  Issue Type: Improvement
>  Components: Deployment / Kubernetes, Tests
>Reporter: Yang Wang
>Priority: Major
>
> Follow the discussion in this PR [1].
> If we could run the K8s related e2e tests with multiple versions (the latest 
> version, oldest maintained version, etc.), it would help us find the 
> usability issues earlier, before users post them in the user ML [2]. 
> [1].https://github.com/apache/flink/pull/12071#discussion_r422838483
> [2].https://lists.apache.org/list.html?u...@flink.apache.org:lte=1M:Cannot%20start%20native%20K8s



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [flink] liupc commented on pull request #13141: [FLINK-18852] Fix StreamScan doesn't inherit parallelism from input in legacy planner

2020-08-16 Thread GitBox


liupc commented on pull request #13141:
URL: https://github.com/apache/flink/pull/13141#issuecomment-674630972


   > There are some tests failing. Please have a look.
   
   Hi @wuchong, I think the test failure has nothing to do with this PR.
   It reports test failures in ExplainTest.
   
   
https://dev.azure.com/apache-flink/apache-flink/_build/results?buildId=5534=logs=b2f046ab-ae17-5406-acdc-240be7e870e4=93e5ae06-d194-513d-ba8d-150ef6da1d7c=7515
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[jira] [Updated] (FLINK-18930) Translate "Hive Dialect" page of "Hive Integration" into Chinese

2020-08-16 Thread Jingsong Lee (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18930?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jingsong Lee updated FLINK-18930:
-
Fix Version/s: 1.12.0

> Translate "Hive Dialect" page of "Hive Integration" into Chinese
> 
>
> Key: FLINK-18930
> URL: https://issues.apache.org/jira/browse/FLINK-18930
> Project: Flink
>  Issue Type: Sub-task
>  Components: Connectors / Hive, Documentation
>Reporter: Rui Li
>Assignee: ZhuShang
>Priority: Major
> Fix For: 1.12.0
>
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (FLINK-18930) Translate "Hive Dialect" page of "Hive Integration" into Chinese

2020-08-16 Thread Jingsong Lee (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-18930?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17178689#comment-17178689
 ] 

Jingsong Lee commented on FLINK-18930:
--

[~ZhuShang] Welcome, assigned to you.

> Translate "Hive Dialect" page of "Hive Integration" into Chinese
> 
>
> Key: FLINK-18930
> URL: https://issues.apache.org/jira/browse/FLINK-18930
> Project: Flink
>  Issue Type: Sub-task
>  Components: Connectors / Hive, Documentation
>Reporter: Rui Li
>Assignee: ZhuShang
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (FLINK-18930) Translate "Hive Dialect" page of "Hive Integration" into Chinese

2020-08-16 Thread Jingsong Lee (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18930?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jingsong Lee reassigned FLINK-18930:


Assignee: ZhuShang

> Translate "Hive Dialect" page of "Hive Integration" into Chinese
> 
>
> Key: FLINK-18930
> URL: https://issues.apache.org/jira/browse/FLINK-18930
> Project: Flink
>  Issue Type: Sub-task
>  Components: Connectors / Hive, Documentation
>Reporter: Rui Li
>Assignee: ZhuShang
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [flink] flinkbot edited a comment on pull request #12962: [FLINK-18694] Add unaligned checkpoint config to web ui

2020-08-16 Thread GitBox


flinkbot edited a comment on pull request #12962:
URL: https://github.com/apache/flink/pull/12962#issuecomment-662526701


   
   ## CI report:
   
   * d2275584151f5a0a342af15e1d06f33da1237d62 UNKNOWN
   * d97679a726771744ab24dd4120ca3aae93d96c23 Azure: 
[SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5575)
 
   
   
   Bot commands
 The @flinkbot bot supports the following commands:
   
- `@flinkbot run travis` re-run the last Travis build
- `@flinkbot run azure` re-run the last Azure build
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[jira] [Comment Edited] (FLINK-18081) Fix broken links in "Kerberos Authentication Setup and Configuration" doc

2020-08-16 Thread Zhu Zhu (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-18081?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17178687#comment-17178687
 ] 

Zhu Zhu edited comment on FLINK-18081 at 8/17/20, 2:48 AM:
---

I have assigned the ticket to you [~karmagyz]


was (Author: zhuzh):
I have assigned the ticket to you.

> Fix broken links in "Kerberos Authentication Setup and Configuration" doc
> -
>
> Key: FLINK-18081
> URL: https://issues.apache.org/jira/browse/FLINK-18081
> Project: Flink
>  Issue Type: Bug
>  Components: Documentation, Runtime / Configuration
>Affects Versions: 1.10.1, 1.11.0, 1.12.0
>Reporter: Yangze Guo
>Assignee: Yangze Guo
>Priority: Major
> Fix For: 1.12.0, 1.11.2, 1.10.3
>
>
> The {{config.html#kerberos-based-security}} is not valid now.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (FLINK-18081) Fix broken links in "Kerberos Authentication Setup and Configuration" doc

2020-08-16 Thread Zhu Zhu (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-18081?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17178687#comment-17178687
 ] 

Zhu Zhu commented on FLINK-18081:
-

I have assigned the ticket to you.

> Fix broken links in "Kerberos Authentication Setup and Configuration" doc
> -
>
> Key: FLINK-18081
> URL: https://issues.apache.org/jira/browse/FLINK-18081
> Project: Flink
>  Issue Type: Bug
>  Components: Documentation, Runtime / Configuration
>Affects Versions: 1.10.1, 1.11.0, 1.12.0
>Reporter: Yangze Guo
>Assignee: Yangze Guo
>Priority: Major
> Fix For: 1.12.0, 1.11.2, 1.10.3
>
>
> The {{config.html#kerberos-based-security}} is not valid now.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (FLINK-18081) Fix broken links in "Kerberos Authentication Setup and Configuration" doc

2020-08-16 Thread Zhu Zhu (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18081?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Zhu Zhu reassigned FLINK-18081:
---

Assignee: Yangze Guo

> Fix broken links in "Kerberos Authentication Setup and Configuration" doc
> -
>
> Key: FLINK-18081
> URL: https://issues.apache.org/jira/browse/FLINK-18081
> Project: Flink
>  Issue Type: Bug
>  Components: Documentation, Runtime / Configuration
>Affects Versions: 1.10.1, 1.11.0, 1.12.0
>Reporter: Yangze Guo
>Assignee: Yangze Guo
>Priority: Major
> Fix For: 1.12.0, 1.11.2, 1.10.3
>
>
> The {{config.html#kerberos-based-security}} is not valid now.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (FLINK-18969) Source code build error: Could not resolve dependencies for project org.apache.flink:flink-oss-fs-hadoop:jar:1.12-SNAPSHOT

2020-08-16 Thread Xin Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18969?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Xin Wang updated FLINK-18969:
-
Environment: 
Java 8

mac os  10.14.6

branch: master

 

  was:
Java 8

mac os  10.14.6

 


> Source code build error:  Could not resolve dependencies for project 
> org.apache.flink:flink-oss-fs-hadoop:jar:1.12-SNAPSHOT
> ---
>
> Key: FLINK-18969
> URL: https://issues.apache.org/jira/browse/FLINK-18969
> Project: Flink
>  Issue Type: Bug
>  Components: FileSystems
>Affects Versions: 1.12.0
> Environment: Java 8
> mac os  10.14.6
> branch: master
>  
>Reporter: Xin Wang
>Priority: Major
>  Labels: compile
> Attachments: image-2020-08-17-09-55-19-914.png, 
> image-2020-08-17-09-55-51-714.png
>
>
> When I type the command:
>  
> {code:java}
> // code placeholder
> cd flink
> mvn clean install -Dmaven.test.skip=true
> it occurs:
>     org.apache.flink
>   30         flink-parent
>   31         1.12-SNAPSHOT
> {code}
>  
>  
> !image-2020-08-17-09-55-19-914.png!
> !image-2020-08-17-09-55-51-714.png!
>  
> {code:java}
> //代码占位符
>  
> [INFO] 
> 
> [INFO] BUILD FAILURE
> [INFO] 
> 
> [INFO] Total time:  03:33 min
> [INFO] Finished at: 2020-08-17T09:47:32+08:00
> [INFO] 
> 
> [ERROR] Failed to execute goal on project flink-oss-fs-hadoop: Could not 
> resolve dependencies for project 
> org.apache.flink:flink-oss-fs-hadoop:jar:1.12-SNAPSHOT: Could not find 
> artifact org.apache.flink:flink-fs-hadoop-shaded:jar:tests:1.12-SNAPSHOT -> 
> [Help 1]
> [ERROR]
> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e 
> switch.
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> [ERROR]
> [ERROR] For more information about the errors and possible solutions, please 
> read the following articles:
> [ERROR] [Help 1] 
> http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
> [ERROR]
> [ERROR] After correcting the problems, you can resume the build with the 
> command
> [ERROR]   mvn  -rf :flink-oss-fs-hadoop
> ZBMAC-C02WD3R01:flink wangxin813$ flink-oss-fs-hadoop{code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
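
The error above appears to stem from the build flag rather than the module itself 
(an assumption, not confirmed in the report): {{-Dmaven.test.skip=true}} skips 
compiling test classes, so the test-jar of {{flink-fs-hadoop-shaded}} is never 
built or installed, and {{flink-oss-fs-hadoop}}, which depends on that 
{{jar:tests}} artifact, cannot resolve it. A minimal sketch of a build invocation 
that avoids this, assuming a standard Maven 3 setup and that the module lives 
under {{flink-filesystems/flink-oss-fs-hadoop}}:

{code:bash}
# Sketch, not a verified fix: -DskipTests still compiles test classes and
# installs test-jars (e.g. flink-fs-hadoop-shaded's jar:tests), unlike
# -Dmaven.test.skip=true, which skips building them entirely.
cd flink
mvn clean install -DskipTests

# If only the failing module is needed, build it together with its upstream
# dependencies (-am); the module path below is an assumption.
mvn clean install -DskipTests -pl flink-filesystems/flink-oss-fs-hadoop -am
{code}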


[jira] [Updated] (FLINK-18969) Source code build error: Could not resolve dependencies for project org.apache.flink:flink-oss-fs-hadoop:jar:1.12-SNAPSHOT

2020-08-16 Thread Xin Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18969?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Xin Wang updated FLINK-18969:
-
Description: 
When I type the command:

 
{code:java}
// code placeholder
cd flink
mvn clean install -Dmaven.test.skip=true
{code}
the following error occurs (the pom.xml parent declaration shows groupId 
org.apache.flink, artifactId flink-parent, version 1.12-SNAPSHOT):
 

 

!image-2020-08-17-09-55-19-914.png!

!image-2020-08-17-09-55-51-714.png!

 
{code:java}
//代码占位符

 
[INFO] 
[INFO] BUILD FAILURE
[INFO] 
[INFO] Total time:  03:33 min
[INFO] Finished at: 2020-08-17T09:47:32+08:00
[INFO] 
[ERROR] Failed to execute goal on project flink-oss-fs-hadoop: Could not 
resolve dependencies for project 
org.apache.flink:flink-oss-fs-hadoop:jar:1.12-SNAPSHOT: Could not find artifact 
org.apache.flink:flink-fs-hadoop-shaded:jar:tests:1.12-SNAPSHOT -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e 
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please 
read the following articles:
[ERROR] [Help 1] 
http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn  -rf :flink-oss-fs-hadoop
ZBMAC-C02WD3R01:flink wangxin813$ flink-oss-fs-hadoop{code}

  was:
When I type the command:

 

cd flink

mvn clean install -Dmaven.test.skip=true

the following error occurs (the pom.xml parent declaration shows groupId 
org.apache.flink, artifactId flink-parent, version 1.12-SNAPSHOT):

 

!image-2020-08-17-09-55-19-914.png!

!image-2020-08-17-09-55-51-714.png!

 

[INFO] 

[INFO] BUILD FAILURE

[INFO] 

[INFO] Total time:  03:33 min

[INFO] Finished at: 2020-08-17T09:47:32+08:00

[INFO] 

[ERROR] Failed to execute goal on project flink-oss-fs-hadoop: Could not 
resolve dependencies for project 
org.apache.flink:flink-oss-fs-hadoop:jar:1.12-SNAPSHOT: Could not find artifact 
org.apache.flink:flink-fs-hadoop-shaded:jar:tests:1.12-SNAPSHOT -> [Help 1]

[ERROR] 

[ERROR] To see the full stack trace of the errors, re-run Maven with the -e 
switch.

[ERROR] Re-run Maven using the -X switch to enable full debug logging.

[ERROR] 

[ERROR] For more information about the errors and possible solutions, please 
read the following articles:

[ERROR] [Help 1] 
http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException

[ERROR] 

[ERROR] After correcting the problems, you can resume the build with the command

[ERROR]   mvn  -rf :flink-oss-fs-hadoop

ZBMAC-C02WD3R01:flink wangxin813$ flink-oss-fs-hadoop


> Source code build error:  Could not resolve dependencies for project 
> org.apache.flink:flink-oss-fs-hadoop:jar:1.12-SNAPSHOT
> ---
>
> Key: FLINK-18969
> URL: https://issues.apache.org/jira/browse/FLINK-18969
> Project: Flink
>  Issue Type: Bug
>  Components: FileSystems
>Affects Versions: 1.12.0
> Environment: Java 8
> mac os  10.14.6
>  
>Reporter: Xin Wang
>Priority: Major
>  Labels: compile
> Attachments: image-2020-08-17-09-55-19-914.png, 
> image-2020-08-17-09-55-51-714.png
>
>
> When I type the command:
>  
> {code:java}
> // code placeholder
> cd flink
> mvn clean install -Dmaven.test.skip=true
> {code}
> the following error occurs (the pom.xml parent declaration shows groupId 
> org.apache.flink, artifactId flink-parent, version 1.12-SNAPSHOT):
>  
>  
> !image-2020-08-17-09-55-19-914.png!
> !image-2020-08-17-09-55-51-714.png!
>  
> {code:java}
> //代码占位符
>  
> [INFO] 
> 
> [INFO] BUILD FAILURE
> [INFO] 
> 
> [INFO] Total time:  03:33 min
> [INFO] Finished at: 2020-08-17T09:47:32+08:00
> [INFO] 
> 
> [ERROR] Failed to execute goal on project flink-oss-fs-hadoop: Could not 
> resolve dependencies for project 
> org.apache.flink:flink-oss-fs-hadoop:jar:1.12-SNAPSHOT: Could not find 
> artifact org.apache.flink:flink-fs-hadoop-shaded:jar:tests:1.12-SNAPSHOT -> 
> [Help 1]
> [ERROR]
> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e 
> switch.
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> [ERROR]
> [ERROR] For more 

[jira] [Commented] (FLINK-18930) Translate "Hive Dialect" page of "Hive Integration" into Chinese

2020-08-16 Thread ZhuShang (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-18930?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17178686#comment-17178686
 ] 

ZhuShang commented on FLINK-18930:
--

Hi [~lirui],

I'm willing to do this translation. Could you assign it to me? Thanks.

> Translate "Hive Dialect" page of "Hive Integration" into Chinese
> 
>
> Key: FLINK-18930
> URL: https://issues.apache.org/jira/browse/FLINK-18930
> Project: Flink
>  Issue Type: Sub-task
>  Components: Connectors / Hive, Documentation
>Reporter: Rui Li
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [flink] flinkbot edited a comment on pull request #13147: [FLINK-18910][docs] Create the new document structure for Python documentation according to FLIP-133.

2020-08-16 Thread GitBox


flinkbot edited a comment on pull request #13147:
URL: https://github.com/apache/flink/pull/13147#issuecomment-673990635


   
   ## CI report:
   
   * 1e1a4bbc5dc7df0f7921b01a0fe36c9949818e76 Azure: 
[SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5576)
 
   
   
   Bot commands
 The @flinkbot bot supports the following commands:
   
- `@flinkbot run travis` re-run the last Travis build
- `@flinkbot run azure` re-run the last Azure build
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[jira] [Assigned] (FLINK-18932) Add a "Overview" document under the "Python API" section

2020-08-16 Thread Dian Fu (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18932?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dian Fu reassigned FLINK-18932:
---

Assignee: Wei Zhong

> Add a "Overview" document under  the "Python API" section
> -
>
> Key: FLINK-18932
> URL: https://issues.apache.org/jira/browse/FLINK-18932
> Project: Flink
>  Issue Type: Sub-task
>  Components: Documentation
>Reporter: Wei Zhong
>Assignee: Wei Zhong
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (FLINK-18933) Delete the old Python Table API document section

2020-08-16 Thread Dian Fu (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18933?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dian Fu reassigned FLINK-18933:
---

Assignee: Wei Zhong

> Delete the old Python Table API document section
> 
>
> Key: FLINK-18933
> URL: https://issues.apache.org/jira/browse/FLINK-18933
> Project: Flink
>  Issue Type: Sub-task
>  Components: API / Python, Documentation
>Reporter: Wei Zhong
>Assignee: Wei Zhong
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (FLINK-18929) Add a "API Docs" link (linked to the generated sphinx docs) under the "Python API" section

2020-08-16 Thread Dian Fu (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18929?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dian Fu reassigned FLINK-18929:
---

Assignee: Huang Xingbo

> Add a "API Docs" link (linked to the generated sphinx docs) under the "Python 
> API" section
> --
>
> Key: FLINK-18929
> URL: https://issues.apache.org/jira/browse/FLINK-18929
> Project: Flink
>  Issue Type: Sub-task
>  Components: Documentation
>Reporter: Wei Zhong
>Assignee: Huang Xingbo
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (FLINK-18926) Add a "Environment Variables" document under the "Python API" -> "User Guide" -> "Table API" section

2020-08-16 Thread Dian Fu (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18926?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dian Fu reassigned FLINK-18926:
---

Assignee: Wei Zhong

> Add a "Environment Variables" document under  the "Python API" -> "User 
> Guide" -> "Table API" section
> -
>
> Key: FLINK-18926
> URL: https://issues.apache.org/jira/browse/FLINK-18926
> Project: Flink
>  Issue Type: Sub-task
>  Components: Documentation
>Reporter: Wei Zhong
>Assignee: Wei Zhong
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (FLINK-18923) Add a "CEP" document under the "Python API" -> "User Guide" -> "Table API" section

2020-08-16 Thread Dian Fu (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18923?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dian Fu reassigned FLINK-18923:
---

Assignee: Dian Fu

> Add a "CEP" document under  the "Python API" -> "User Guide" -> "Table API" 
> section
> ---
>
> Key: FLINK-18923
> URL: https://issues.apache.org/jira/browse/FLINK-18923
> Project: Flink
>  Issue Type: Sub-task
>  Components: Documentation
>Reporter: Wei Zhong
>Assignee: Dian Fu
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (FLINK-18927) Add a "Debugging" document under the "Python API" -> "User Guide" -> "Table API" section

2020-08-16 Thread Dian Fu (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18927?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dian Fu reassigned FLINK-18927:
---

Assignee: Huang Xingbo

> Add a "Debugging" document under  the "Python API" -> "User Guide" -> "Table 
> API" section
> -
>
> Key: FLINK-18927
> URL: https://issues.apache.org/jira/browse/FLINK-18927
> Project: Flink
>  Issue Type: Sub-task
>  Components: Documentation
>Reporter: Wei Zhong
>Assignee: Huang Xingbo
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (FLINK-18921) Add a "SQL" link (linked to dev/table/sql/index.md) under the "Python API" -> "User Guide" -> "Table API" section

2020-08-16 Thread Dian Fu (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18921?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dian Fu reassigned FLINK-18921:
---

Assignee: Huang Xingbo

> Add a "SQL" link (linked to dev/table/sql/index.md) under the "Python API" -> 
> "User Guide" -> "Table API" section
> -
>
> Key: FLINK-18921
> URL: https://issues.apache.org/jira/browse/FLINK-18921
> Project: Flink
>  Issue Type: Sub-task
>  Components: Documentation
>Reporter: Wei Zhong
>Assignee: Huang Xingbo
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (FLINK-18917) Add a "Built-in Functions" link (linked to dev/table/functions/systemFunctions.md) under the "Python API" -> "User Guide" -> "Table API" section

2020-08-16 Thread Dian Fu (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18917?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dian Fu reassigned FLINK-18917:
---

Assignee: Wei Zhong

> Add a "Built-in Functions" link (linked to 
> dev/table/functions/systemFunctions.md) under the "Python API" -> "User 
> Guide" -> "Table API" section
> 
>
> Key: FLINK-18917
> URL: https://issues.apache.org/jira/browse/FLINK-18917
> Project: Flink
>  Issue Type: Sub-task
>  Components: Documentation
>Reporter: Wei Zhong
>Assignee: Wei Zhong
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (FLINK-18916) Add a "Operations" link(linked to dev/table/tableApi.md) under the "Python API" -> "User Guide" -> "Table API" section

2020-08-16 Thread Dian Fu (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18916?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dian Fu reassigned FLINK-18916:
---

Assignee: Dian Fu

> Add a "Operations" link(linked to dev/table/tableApi.md) under the "Python 
> API" -> "User Guide" -> "Table API" section
> --
>
> Key: FLINK-18916
> URL: https://issues.apache.org/jira/browse/FLINK-18916
> Project: Flink
>  Issue Type: Sub-task
>  Components: Documentation
>Reporter: Wei Zhong
>Assignee: Dian Fu
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (FLINK-18922) Add a "Catalogs" link (linked to dev/table/catalogs.md) under the "Python API" -> "User Guide" -> "Table API" section

2020-08-16 Thread Dian Fu (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18922?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dian Fu reassigned FLINK-18922:
---

Assignee: Dian Fu

> Add a "Catalogs" link (linked to dev/table/catalogs.md) under the "Python 
> API" -> "User Guide" -> "Table API" section
> -
>
> Key: FLINK-18922
> URL: https://issues.apache.org/jira/browse/FLINK-18922
> Project: Flink
>  Issue Type: Sub-task
>  Components: Documentation
>Reporter: Wei Zhong
>Assignee: Dian Fu
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (FLINK-18801) Add a "10 minutes to Table API" document under the "Python API" -> "User Guide" -> "Table API" section

2020-08-16 Thread Dian Fu (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18801?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dian Fu reassigned FLINK-18801:
---

Assignee: Wei Zhong

> Add a "10 minutes to Table API" document under  the "Python API" -> "User 
> Guide" -> "Table API" section
> ---
>
> Key: FLINK-18801
> URL: https://issues.apache.org/jira/browse/FLINK-18801
> Project: Flink
>  Issue Type: Sub-task
>Reporter: Wei Zhong
>Assignee: Wei Zhong
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (FLINK-18937) Add a "Environment Setup" section to the "Installation" document

2020-08-16 Thread Dian Fu (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18937?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dian Fu reassigned FLINK-18937:
---

Assignee: Huang Xingbo  (was: Dian Fu)

> Add a "Environment Setup" section to the "Installation" document
> 
>
> Key: FLINK-18937
> URL: https://issues.apache.org/jira/browse/FLINK-18937
> Project: Flink
>  Issue Type: Sub-task
>  Components: API / Python, Documentation
>Reporter: Wei Zhong
>Assignee: Huang Xingbo
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (FLINK-18913) Add a "TableEnvironment" document under the "Python API" -> "User Guide" -> "Table API" section

2020-08-16 Thread Dian Fu (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18913?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dian Fu reassigned FLINK-18913:
---

Assignee: Wei Zhong

> Add a "TableEnvironment" document under  the "Python API" -> "User Guide" -> 
> "Table API" section
> 
>
> Key: FLINK-18913
> URL: https://issues.apache.org/jira/browse/FLINK-18913
> Project: Flink
>  Issue Type: Sub-task
>  Components: Documentation
>Reporter: Wei Zhong
>Assignee: Wei Zhong
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (FLINK-18912) Add a Table API tutorial link(linked to try-flink/python_table_api.md) under the "Python API" -> "GettingStart" -> "Tutorial" section

2020-08-16 Thread Hequn Cheng (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18912?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Hequn Cheng reassigned FLINK-18912:
---

Assignee: Hequn Cheng

> Add a Table API tutorial link(linked to try-flink/python_table_api.md) under  
> the "Python API" -> "GettingStart" -> "Tutorial" section
> --
>
> Key: FLINK-18912
> URL: https://issues.apache.org/jira/browse/FLINK-18912
> Project: Flink
>  Issue Type: Sub-task
>  Components: Documentation
>Reporter: Wei Zhong
>Assignee: Hequn Cheng
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (FLINK-18928) Move the "Common Questions" document from the old Python Table API documentation to the "Python API" section with a new name "FAQ"

2020-08-16 Thread Dian Fu (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18928?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dian Fu reassigned FLINK-18928:
---

Assignee: Wei Zhong

> Move the "Common Questions" document from the old Python Table API 
> documentation to the "Python API" section with a new name "FAQ"
> --
>
> Key: FLINK-18928
> URL: https://issues.apache.org/jira/browse/FLINK-18928
> Project: Flink
>  Issue Type: Sub-task
>  Components: Documentation
>Reporter: Wei Zhong
>Assignee: Wei Zhong
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (FLINK-18925) Move the "Configuration" document from the old Python Table API documentation to the "Python API" -> "User Guide" -> "Table API" section

2020-08-16 Thread Dian Fu (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18925?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dian Fu reassigned FLINK-18925:
---

Assignee: Wei Zhong

> Move the "Configuration" document from the old Python Table API documentation 
> to the "Python API" -> "User Guide" -> "Table API" section
> 
>
> Key: FLINK-18925
> URL: https://issues.apache.org/jira/browse/FLINK-18925
> Project: Flink
>  Issue Type: Sub-task
>  Components: Documentation
>Reporter: Wei Zhong
>Assignee: Wei Zhong
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (FLINK-18918) Add a "Connectors" document under the "Python API" -> "User Guide" -> "Table API" section

2020-08-16 Thread Hequn Cheng (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18918?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Hequn Cheng reassigned FLINK-18918:
---

Assignee: Hequn Cheng

> Add a "Connectors" document under  the "Python API" -> "User Guide" -> "Table 
> API" section
> --
>
> Key: FLINK-18918
> URL: https://issues.apache.org/jira/browse/FLINK-18918
> Project: Flink
>  Issue Type: Sub-task
>  Components: Documentation
>Reporter: Wei Zhong
>Assignee: Hequn Cheng
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (FLINK-18924) Move the "Metrics" document from the old Python Table API documentation to the "Python API" -> "User Guide" -> "Table API" section

2020-08-16 Thread Dian Fu (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18924?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dian Fu reassigned FLINK-18924:
---

Assignee: Wei Zhong

> Move the "Metrics" document from the old Python Table API documentation to 
> the "Python API" -> "User Guide" -> "Table API" section
> --
>
> Key: FLINK-18924
> URL: https://issues.apache.org/jira/browse/FLINK-18924
> Project: Flink
>  Issue Type: Sub-task
>  Components: Documentation
>Reporter: Wei Zhong
>Assignee: Wei Zhong
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (FLINK-18937) Add a "Environment Setup" section to the "Installation" document

2020-08-16 Thread Dian Fu (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18937?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dian Fu reassigned FLINK-18937:
---

Assignee: Dian Fu

> Add a "Environment Setup" section to the "Installation" document
> 
>
> Key: FLINK-18937
> URL: https://issues.apache.org/jira/browse/FLINK-18937
> Project: Flink
>  Issue Type: Sub-task
>  Components: API / Python, Documentation
>Reporter: Wei Zhong
>Assignee: Dian Fu
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (FLINK-18919) Move the "User Defined Functions" document and "Vectorized User Defined Functions" document from the old Python Table API documentation to the "Python API" -> "User Gui

2020-08-16 Thread Dian Fu (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18919?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dian Fu reassigned FLINK-18919:
---

Assignee: Wei Zhong

> Move the "User Defined Functions" document and "Vectorized User Defined 
> Functions" document from the old Python Table API documentation to the 
> "Python API" -> "User Guide" -> "Table API" section
> --
>
> Key: FLINK-18919
> URL: https://issues.apache.org/jira/browse/FLINK-18919
> Project: Flink
>  Issue Type: Sub-task
>  Components: Documentation
>Reporter: Wei Zhong
>Assignee: Wei Zhong
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (FLINK-18920) Move the "Dependency Management" document from the old Python Table API documentation to the "Python API" -> "User Guide" -> "Table API" section

2020-08-16 Thread Dian Fu (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18920?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dian Fu reassigned FLINK-18920:
---

Assignee: Wei Zhong

> Move the "Dependency Management" document from the old Python Table API 
> documentation to the "Python API" -> "User Guide" -> "Table API" section
> 
>
> Key: FLINK-18920
> URL: https://issues.apache.org/jira/browse/FLINK-18920
> Project: Flink
>  Issue Type: Sub-task
>  Components: Documentation
>Reporter: Wei Zhong
>Assignee: Wei Zhong
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (FLINK-18914) Move the "Python DataTypes" document from the old Python Table API documentation to the "Python API" -> "User Guide" -> "Table API" section with a new name "DataTypes"

2020-08-16 Thread Dian Fu (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18914?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dian Fu reassigned FLINK-18914:
---

Assignee: Wei Zhong

> Move the "Python DataTypes" document from the old Python Table API 
> documentation to the "Python API" -> "User Guide" -> "Table API" section with 
> a new name "DataTypes"
> ---
>
> Key: FLINK-18914
> URL: https://issues.apache.org/jira/browse/FLINK-18914
> Project: Flink
>  Issue Type: Sub-task
>  Components: Documentation
>Reporter: Wei Zhong
>Assignee: Wei Zhong
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (FLINK-18911) Move the "Installation" document from the old Python Table API documentation to the "Python API" -> "GettingStart" section

2020-08-16 Thread Dian Fu (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18911?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dian Fu reassigned FLINK-18911:
---

Assignee: Wei Zhong

> Move the "Installation" document from the old Python Table API documentation 
> to the "Python API" -> "GettingStart" section
> --
>
> Key: FLINK-18911
> URL: https://issues.apache.org/jira/browse/FLINK-18911
> Project: Flink
>  Issue Type: Sub-task
>  Components: Documentation
>Reporter: Wei Zhong
>Assignee: Wei Zhong
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (FLINK-18775) Rework PyFlink Documentation

2020-08-16 Thread Dian Fu (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18775?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dian Fu updated FLINK-18775:

Fix Version/s: (was: 1.11.1)
   (was: 1.11.0)

> Rework PyFlink Documentation
> 
>
> Key: FLINK-18775
> URL: https://issues.apache.org/jira/browse/FLINK-18775
> Project: Flink
>  Issue Type: Improvement
>  Components: API / Python, Documentation
>Affects Versions: 1.11.0, 1.11.1
>Reporter: sunjincheng
>Assignee: Wei Zhong
>Priority: Major
>  Labels: beginner
> Fix For: 1.12.0, 1.11.2
>
>
> Since the release of Flink 1.11, users of PyFlink have continued to grow. 
> According to the feedback we received, current Flink documentation is not 
> very friendly to PyFlink users. There are two shortcomings:
>  # Python related content is mixed in the Java/Scala documentation, which 
> makes it difficult for users who only focus on PyFlink to read.
>  # There is already a "Python Table API" section in the Table API document to 
> store PyFlink documents, but the number of articles is small and the content 
> is fragmented. It is difficult for beginners to learn from it.
> In addition, 
> [FLIP-130|https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=158866298]
>  introduced the Python DataStream API. Many documents will be added for those 
> new APIs. In order to increase the readability and maintainability of the 
> PyFlink document, we would like to rework it via this umbrella JIRA.
>  
> The detail can be found in 
> [FLIP-133|https://cwiki.apache.org/confluence/display/FLINK/FLIP-133%3A+Rework+PyFlink+Documentation]



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [flink] flinkbot edited a comment on pull request #13147: [FLINK-18910][docs] Create the new document structure for Python documentation according to FLIP-133.

2020-08-16 Thread GitBox


flinkbot edited a comment on pull request #13147:
URL: https://github.com/apache/flink/pull/13147#issuecomment-673990635


   
   ## CI report:
   
   * 568e67dff31b0cc5967bcb10c1bcb02f5ae90d28 Azure: 
[SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5531)
 
   * 1e1a4bbc5dc7df0f7921b01a0fe36c9949818e76 UNKNOWN
   
   
   Bot commands
 The @flinkbot bot supports the following commands:
   
- `@flinkbot run travis` re-run the last Travis build
- `@flinkbot run azure` re-run the last Azure build
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[jira] [Updated] (FLINK-18959) Fail to archiveExecutionGraph because job is not finished when dispatcher close

2020-08-16 Thread Liu (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18959?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Liu updated FLINK-18959:

Description: 
When a job is cancelled, we expect to see it in Flink's history server. But I 
cannot see my job after it is cancelled.

After digging into the problem, I found that the function archiveExecutionGraph 
is not executed. Below is the brief log:
{panel:title=log}
2020-08-14 15:10:06,406 INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph 
[flink-akka.actor.default-dispatcher- 15] - Job EtlAndWindow 
(6f784d4cc5bae88a332d254b21660372) switched from state RUNNING to CANCELLING.

2020-08-14 15:10:06,415 DEBUG 
org.apache.flink.runtime.dispatcher.MiniDispatcher 
[flink-akka.actor.default-dispatcher-3] - Shutting down per-job cluster because 
the job was canceled.

2020-08-14 15:10:06,629 INFO org.apache.flink.runtime.dispatcher.MiniDispatcher 
[flink-akka.actor.default-dispatcher-3] - Stopping dispatcher 
akka.tcp://flink@bjfk-c9865.yz02:38663/user/dispatcher.

2020-08-14 15:10:06,629 INFO org.apache.flink.runtime.dispatcher.MiniDispatcher 
[flink-akka.actor.default-dispatcher-3] - Stopping all currently running jobs 
of dispatcher akka.tcp://flink@bjfk-c9865.yz02:38663/user/dispatcher.

2020-08-14 15:10:06,631 INFO org.apache.flink.runtime.jobmaster.JobMaster 
[flink-akka.actor.default-dispatcher-29] - Stopping the JobMaster for job 
EtlAndWindow(6f784d4cc5bae88a332d254b21660372).

2020-08-14 15:10:06,632 DEBUG org.apache.flink.runtime.jobmaster.JobMaster 
[flink-akka.actor.default-dispatcher-29] - Disconnect TaskExecutor 
container_e144_1590060720089_2161_01_06 because: Stopping JobMaster for job 
EtlAndWindow(6f784d4cc5bae88a332d254b21660372).

2020-08-14 15:10:06,646 INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph 
[flink-akka.actor.default-dispatcher-29] - Job EtlAndWindow 
(6f784d4cc5bae88a332d254b21660372) switched from state CANCELLING to CANCELED.

2020-08-14 15:10:06,664 DEBUG 
org.apache.flink.runtime.dispatcher.MiniDispatcher 
[flink-akka.actor.default-dispatcher-4] - There is a newer JobManagerRunner for 
the job 6f784d4cc5bae88a332d254b21660372.
{panel}
From the log, we can see that the job is not finished when the dispatcher closes. 
The process is as follows:
 * The cancel command is received and sent to all tasks asynchronously.
 * In MiniDispatcher, shutting down the per-job cluster begins.
 * The dispatcher is stopped and the job is removed.
 * The job is cancelled and the callback in the method startJobManagerRunner is 
executed.
 * Because the job was already removed, currentJobManagerRunner is null and does 
not equal the original jobManagerRunner. In this case, the archivedExecutionGraph 
will not be uploaded.

In normal cases, I find that the job is cancelled first and then the dispatcher is 
stopped, so archiving the execution graph succeeds. But the order is not 
constrained and it is hard to know which comes first.

The above is what I suspect. If so, we should fix it.

 

  was:
When job is cancelled, we expect to see it in flink's history server. But I can 
not see my job after it is cancelled.

After digging into the problem, I find that the function archiveExecutionGraph 
is not executed. Below is the brief log:
{panel:title=log}
2020-08-14 15:10:06,406 INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph 
[flink-akka.actor.default-dispatcher- 15] - Job EtlAndWindow 
(6f784d4cc5bae88a332d254b21660372) switched from state RUNNING to CANCELLING.

2020-08-14 15:10:06,415 DEBUG 
org.apache.flink.runtime.dispatcher.MiniDispatcher 
[flink-akka.actor.default-dispatcher-3] - Shutting down per-job cluster because 
the job was canceled.

2020-08-14 15:10:06,629 INFO org.apache.flink.runtime.dispatcher.MiniDispatcher 
[flink-akka.actor.default-dispatcher-3] - Stopping dispatcher 
akka.tcp://flink@bjfk-c9865.yz02:38663/user/dispatcher.

2020-08-14 15:10:06,629 INFO org.apache.flink.runtime.dispatcher.MiniDispatcher 
[flink-akka.actor.default-dispatcher-3] - Stopping all currently running jobs 
of dispatcher akka.tcp://flink@bjfk-c9865.yz02:38663/user/dispatcher.

2020-08-14 15:10:06,631 INFO org.apache.flink.runtime.jobmaster.JobMaster 
[flink-akka.actor.default-dispatcher-29] - Stopping the JobMaster for job 
EtlAndWindow(6f784d4cc5bae88a332d254b21660372).

2020-08-14 15:10:06,632 DEBUG org.apache.flink.runtime.jobmaster.JobMaster 
[flink-akka.actor.default-dispatcher-29] - Disconnect TaskExecutor 
container_e144_1590060720089_2161_01_06 because: Stopping JobMaster for job 
EtlAndWindow(6f784d4cc5bae88a332d254b21660372).

2020-08-14 15:10:06,646 INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph 
[flink-akka.actor.default-dispatcher-29] - Job EtlAndWindow 
(6f784d4cc5bae88a332d254b21660372) switched from state CANCELLING to CANCELED.

2020-08-14 15:10:06,664 DEBUG 
org.apache.flink.runtime.dispatcher.MiniDispatcher 
[flink-akka.actor.default-dispatcher-4] - There is a newer JobManagerRunner for 

[jira] [Created] (FLINK-18969) Source code build error: Could not resolve dependencies for project org.apache.flink:flink-oss-fs-hadoop:jar:1.12-SNAPSHOT

2020-08-16 Thread Xin Wang (Jira)
Xin Wang created FLINK-18969:


 Summary: Source code build error:  Could not resolve dependencies 
for project org.apache.flink:flink-oss-fs-hadoop:jar:1.12-SNAPSHOT
 Key: FLINK-18969
 URL: https://issues.apache.org/jira/browse/FLINK-18969
 Project: Flink
  Issue Type: Bug
  Components: FileSystems
Affects Versions: 1.12.0
 Environment: Java 8

mac os  10.14.6

 
Reporter: Xin Wang
 Attachments: image-2020-08-17-09-55-19-914.png, 
image-2020-08-17-09-55-51-714.png

When I type the command:

 

cd flink

mvn clean install -Dmaven.test.skip=true

the following error occurs (the pom.xml parent declaration shows groupId 
org.apache.flink, artifactId flink-parent, version 1.12-SNAPSHOT):

 

!image-2020-08-17-09-55-19-914.png!

!image-2020-08-17-09-55-51-714.png!

 

[INFO] 

[INFO] BUILD FAILURE

[INFO] 

[INFO] Total time:  03:33 min

[INFO] Finished at: 2020-08-17T09:47:32+08:00

[INFO] 

[ERROR] Failed to execute goal on project flink-oss-fs-hadoop: Could not 
resolve dependencies for project 
org.apache.flink:flink-oss-fs-hadoop:jar:1.12-SNAPSHOT: Could not find artifact 
org.apache.flink:flink-fs-hadoop-shaded:jar:tests:1.12-SNAPSHOT -> [Help 1]

[ERROR] 

[ERROR] To see the full stack trace of the errors, re-run Maven with the -e 
switch.

[ERROR] Re-run Maven using the -X switch to enable full debug logging.

[ERROR] 

[ERROR] For more information about the errors and possible solutions, please 
read the following articles:

[ERROR] [Help 1] 
http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException

[ERROR] 

[ERROR] After correcting the problems, you can resume the build with the command

[ERROR]   mvn  -rf :flink-oss-fs-hadoop

ZBMAC-C02WD3R01:flink wangxin813$ flink-oss-fs-hadoop



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (FLINK-18081) Fix broken links in "Kerberos Authentication Setup and Configuration" doc

2020-08-16 Thread Yangze Guo (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-18081?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17178677#comment-17178677
 ] 

Yangze Guo commented on FLINK-18081:


Yes, please assign this to me. Thx.

> Fix broken links in "Kerberos Authentication Setup and Configuration" doc
> -
>
> Key: FLINK-18081
> URL: https://issues.apache.org/jira/browse/FLINK-18081
> Project: Flink
>  Issue Type: Bug
>  Components: Documentation, Runtime / Configuration
>Affects Versions: 1.10.1, 1.11.0, 1.12.0
>Reporter: Yangze Guo
>Priority: Major
> Fix For: 1.12.0, 1.11.2, 1.10.3
>
>
> The {{config.html#kerberos-based-security}} is not valid now.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [flink] flinkbot edited a comment on pull request #13115: [FLINK-18219][runtime] Added OOM-enrichment for main method call in PackagedProgram.

2020-08-16 Thread GitBox


flinkbot edited a comment on pull request #13115:
URL: https://github.com/apache/flink/pull/13115#issuecomment-671776288


   
   ## CI report:
   
   * 3695a2619ae9f252fc1440d144559b14a597dee8 Azure: 
[SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5572)
 
   
   
   Bot commands
 The @flinkbot bot supports the following commands:
   
- `@flinkbot run travis` re-run the last Travis build
- `@flinkbot run azure` re-run the last Azure build
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[jira] [Assigned] (FLINK-18910) Create the new document structure for Python documentation according to FLIP-133

2020-08-16 Thread sunjincheng (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18910?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

sunjincheng reassigned FLINK-18910:
---

Assignee: Wei Zhong

> Create the new document structure for Python documentation according to 
> FLIP-133
> 
>
> Key: FLINK-18910
> URL: https://issues.apache.org/jira/browse/FLINK-18910
> Project: Flink
>  Issue Type: Sub-task
>  Components: Documentation
>Reporter: Wei Zhong
>Assignee: Wei Zhong
>Priority: Major
>  Labels: pull-request-available
>
> Create the following catalog structure under the "Application Development" 
> catalog:
> *Application Development*
>   *-* *Python API* 
>        *-* Getting Started
>        *-* User Guide
>           *-* Table API



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (FLINK-18965) ExecutionContextTest.testCatalogs failed with "ClassNotFoundException: org.apache.hadoop.fs.BlockStoragePolicySpi"

2020-08-16 Thread Dian Fu (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-18965?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17178671#comment-17178671
 ] 

Dian Fu commented on FLINK-18965:
-

[https://dev.azure.com/apache-flink/apache-flink/_build/results?buildId=5570=logs=a8bc9173-2af6-5ba8-775c-12063b4f1d54=46a16c18-c679-5905-432b-9be5d8e27bc6]

> ExecutionContextTest.testCatalogs failed with "ClassNotFoundException: 
> org.apache.hadoop.fs.BlockStoragePolicySpi"
> --
>
> Key: FLINK-18965
> URL: https://issues.apache.org/jira/browse/FLINK-18965
> Project: Flink
>  Issue Type: Bug
>  Components: Connectors / Hive
>Affects Versions: 1.12.0
>Reporter: Dian Fu
>Priority: Blocker
>  Labels: test-stability
> Fix For: 1.12.0
>
>
> [https://dev.azure.com/apache-flink/apache-flink/_build/results?buildId=5541=logs=a8bc9173-2af6-5ba8-775c-12063b4f1d54=46a16c18-c679-5905-432b-9be5d8e27bc6]
> {code}
> 2020-08-14T21:10:09.3503802Z [ERROR] 
> testCatalogs(org.apache.flink.table.client.gateway.local.ExecutionContextTest)
>   Time elapsed: 0.148 s  <<< ERROR!
> 2020-08-14T21:10:09.3505006Z 
> org.apache.flink.table.client.gateway.SqlExecutionException: Could not create 
> execution context.
> 2020-08-14T21:10:09.3505856Z  at 
> org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:870)
> 2020-08-14T21:10:09.3506790Z  at 
> org.apache.flink.table.client.gateway.local.ExecutionContextTest.createExecutionContext(ExecutionContextTest.java:324)
> 2020-08-14T21:10:09.3508011Z  at 
> org.apache.flink.table.client.gateway.local.ExecutionContextTest.createCatalogExecutionContext(ExecutionContextTest.java:360)
> 2020-08-14T21:10:09.3509273Z  at 
> org.apache.flink.table.client.gateway.local.ExecutionContextTest.testCatalogs(ExecutionContextTest.java:133)
> 2020-08-14T21:10:09.3510548Z  at 
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 2020-08-14T21:10:09.3511496Z  at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 2020-08-14T21:10:09.3512417Z  at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 2020-08-14T21:10:09.3513883Z  at 
> java.lang.reflect.Method.invoke(Method.java:498)
> 2020-08-14T21:10:09.3514563Z  at 
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
> 2020-08-14T21:10:09.3515604Z  at 
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> 2020-08-14T21:10:09.3516643Z  at 
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
> 2020-08-14T21:10:09.3517498Z  at 
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> 2020-08-14T21:10:09.3518189Z  at 
> org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
> 2020-08-14T21:10:09.3519625Z  at 
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
> 2020-08-14T21:10:09.3520621Z  at 
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
> 2020-08-14T21:10:09.3521328Z  at 
> org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
> 2020-08-14T21:10:09.3521978Z  at 
> org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
> 2020-08-14T21:10:09.3522787Z  at 
> org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
> 2020-08-14T21:10:09.3523469Z  at 
> org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
> 2020-08-14T21:10:09.3524045Z  at 
> org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
> 2020-08-14T21:10:09.3524652Z  at 
> org.junit.runners.ParentRunner.run(ParentRunner.java:363)
> 2020-08-14T21:10:09.3525307Z  at 
> org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:365)
> 2020-08-14T21:10:09.3526086Z  at 
> org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:273)
> 2020-08-14T21:10:09.3526996Z  at 
> org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:238)
> 2020-08-14T21:10:09.3527737Z  at 
> org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:159)
> 2020-08-14T21:10:09.3528564Z  at 
> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:384)
> 2020-08-14T21:10:09.3529381Z  at 
> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:345)
> 2020-08-14T21:10:09.3530153Z  at 
> org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:126)
> 2020-08-14T21:10:09.3530883Z  at 
> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:418)
> 2020-08-14T21:10:09.3531641Z Caused by: 
> 

[jira] [Commented] (FLINK-17274) Maven: Premature end of Content-Length delimited message body

2020-08-16 Thread Dian Fu (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-17274?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17178670#comment-17178670
 ] 

Dian Fu commented on FLINK-17274:
-

[https://dev.azure.com/apache-flink/apache-flink/_build/results?buildId=5571=logs=d44f43ce-542c-597d-bf94-b0718c71e5e8=34f486e1-e1e4-5dd2-9c06-bfdd9b9c74a8]

> Maven: Premature end of Content-Length delimited message body
> -
>
> Key: FLINK-17274
> URL: https://issues.apache.org/jira/browse/FLINK-17274
> Project: Flink
>  Issue Type: Bug
>  Components: Build System / Azure Pipelines
>Reporter: Robert Metzger
>Assignee: Robert Metzger
>Priority: Critical
>  Labels: test-stability
> Fix For: 1.12.0
>
>
> CI: 
> https://dev.azure.com/rmetzger/Flink/_build/results?buildId=7786=logs=52b61abe-a3cc-5bde-cc35-1bbe89bb7df5=54421a62-0c80-5aad-3319-094ff69180bb
> {code}
> [ERROR] Failed to execute goal on project 
> flink-connector-elasticsearch7_2.11: Could not resolve dependencies for 
> project 
> org.apache.flink:flink-connector-elasticsearch7_2.11:jar:1.11-SNAPSHOT: Could 
> not transfer artifact org.apache.lucene:lucene-sandbox:jar:8.3.0 from/to 
> alicloud-mvn-mirror 
> (http://mavenmirror.alicloud.dak8s.net:/repository/maven-central/): GET 
> request of: org/apache/lucene/lucene-sandbox/8.3.0/lucene-sandbox-8.3.0.jar 
> from alicloud-mvn-mirror failed: Premature end of Content-Length delimited 
> message body (expected: 289920; received: 239832 -> [Help 1]
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (FLINK-18695) Allow NettyBufferPool to allocate heap buffers

2020-08-16 Thread Yun Gao (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-18695?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17178666#comment-17178666
 ] 

Yun Gao commented on FLINK-18695:
-

I'm currently running tests with the current Netty version to compare the direct 
memory consumption with and without the SSL layer; this should indicate the 
overall memory consumption of the SSL layer. The tests are still running, and I 
think I can finish them in the next two or three days. 

> Allow NettyBufferPool to allocate heap buffers
> --
>
> Key: FLINK-18695
> URL: https://issues.apache.org/jira/browse/FLINK-18695
> Project: Flink
>  Issue Type: Improvement
>  Components: Runtime / Network
>Reporter: Chesnay Schepler
>Assignee: Yun Gao
>Priority: Major
> Fix For: 1.12.0
>
>
> in 4.1.43 netty made a change to their SslHandler to always use heap buffers 
> for JDK SSLEngine implementations, to avoid an additional memory copy.
> However, our {{NettyBufferPool}} forbids heap buffer allocations.
> We will either have to allow heap buffer allocations, or create a custom 
> SslHandler implementation that does not use heap buffers (although this seems 
> ill-adviced?).
> /cc [~sewen] [~uce] [~NicoK] [~zjwang] [~pnowojski]



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (FLINK-11576) FLIP-33: Standardize connector metrics

2020-08-16 Thread Jiangjie Qin (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-11576?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17178659#comment-17178659
 ] 

Jiangjie Qin commented on FLINK-11576:
--

[~trohrmann] No, it has not been abandoned. [~sewen] and I discussed this last 
week and I am going to update the FLIP to include the metrics used by the Source 
in FLIP-27.

> FLIP-33: Standardize connector metrics
> --
>
> Key: FLINK-11576
> URL: https://issues.apache.org/jira/browse/FLINK-11576
> Project: Flink
>  Issue Type: New Feature
>  Components: Connectors / Common
>Reporter: Jiangjie Qin
>Assignee: Jiangjie Qin
>Priority: Major
>
> This is a umbrella ticket for standardize connector metrics. Subtasks will be 
> created for each individual connector. The FLIP link is following:
> https://cwiki.apache.org/confluence/display/FLINK/FLIP-33%3A+Standardize+Connector+Metrics



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [flink] flinkbot edited a comment on pull request #12962: [FLINK-18694] Add unaligned checkpoint config to web ui

2020-08-16 Thread GitBox


flinkbot edited a comment on pull request #12962:
URL: https://github.com/apache/flink/pull/12962#issuecomment-662526701


   
   ## CI report:
   
   * d2275584151f5a0a342af15e1d06f33da1237d62 UNKNOWN
   * d49f2f7ffe75a2e39dc3b8925831c60b56683952 Azure: 
[SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5152)
 
   * d97679a726771744ab24dd4120ca3aae93d96c23 Azure: 
[PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5575)
 
   
   
   Bot commands
 The @flinkbot bot supports the following commands:
   
- `@flinkbot run travis` re-run the last Travis build
- `@flinkbot run azure` re-run the last Azure build
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [flink] flinkbot edited a comment on pull request #12962: [FLINK-18694] Add unaligned checkpoint config to web ui

2020-08-16 Thread GitBox


flinkbot edited a comment on pull request #12962:
URL: https://github.com/apache/flink/pull/12962#issuecomment-662526701


   
   ## CI report:
   
   * d2275584151f5a0a342af15e1d06f33da1237d62 UNKNOWN
   * d49f2f7ffe75a2e39dc3b8925831c60b56683952 Azure: 
[SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5152)
 
   * d97679a726771744ab24dd4120ca3aae93d96c23 UNKNOWN
   
   
   Bot commands
 The @flinkbot bot supports the following commands:
   
- `@flinkbot run travis` re-run the last Travis build
- `@flinkbot run azure` re-run the last Azure build
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [flink] flinkbot edited a comment on pull request #12823: FLINK-18013: Refactor Hadoop utils to a single module

2020-08-16 Thread GitBox


flinkbot edited a comment on pull request #12823:
URL: https://github.com/apache/flink/pull/12823#issuecomment-653912402


   
   ## CI report:
   
   * b35d83002812d5ae8da03cd4760d320c6cf92f82 Azure: 
[FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5568)
 
   
   
   Bot commands
 The @flinkbot bot supports the following commands:
   
- `@flinkbot run travis` re-run the last Travis build
- `@flinkbot run azure` re-run the last Azure build
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [flink] flinkbot edited a comment on pull request #13164: Flink 18946 cassandra

2020-08-16 Thread GitBox


flinkbot edited a comment on pull request #13164:
URL: https://github.com/apache/flink/pull/13164#issuecomment-674561759


   
   ## CI report:
   
   * 6202eb4a68ca77f4ccd9d64dea947ba0c7f735e5 Azure: 
[SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5567)
 
   
   
   Bot commands
 The @flinkbot bot supports the following commands:
   
- `@flinkbot run travis` re-run the last Travis build
- `@flinkbot run azure` re-run the last Azure build
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [flink] flinkbot edited a comment on pull request #13165: [FLINK-18963][docs] Introduced IntelliJ subsection about adding a Copyright Profile for the Apache license.

2020-08-16 Thread GitBox


flinkbot edited a comment on pull request #13165:
URL: https://github.com/apache/flink/pull/13165#issuecomment-674581051


   
   ## CI report:
   
   * 0c04d957cb9a575618de52ddc94c290b2431e92a Azure: 
[SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5574)
 
   
   
   Bot commands
 The @flinkbot bot supports the following commands:
   
- `@flinkbot run travis` re-run the last Travis build
- `@flinkbot run azure` re-run the last Azure build
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [flink] flinkbot commented on pull request #13165: [FLINK-18963][docs] Introduced IntelliJ subsection about adding a Copyright Profile for the Apache license.

2020-08-16 Thread GitBox


flinkbot commented on pull request #13165:
URL: https://github.com/apache/flink/pull/13165#issuecomment-674581051


   
   ## CI report:
   
   * 0c04d957cb9a575618de52ddc94c290b2431e92a UNKNOWN
   
   
   Bot commands
 The @flinkbot bot supports the following commands:
   
- `@flinkbot run travis` re-run the last Travis build
- `@flinkbot run azure` re-run the last Azure build
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [flink] flinkbot edited a comment on pull request #13115: [FLINK-18219][runtime] Added OOM-enrichment for main method call in PackagedProgram.

2020-08-16 Thread GitBox


flinkbot edited a comment on pull request #13115:
URL: https://github.com/apache/flink/pull/13115#issuecomment-671776288


   
   ## CI report:
   
   * c7252f46c5704414a68f0ec9e1659da345afbd6c Azure: 
[FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5569)
 
   * 3695a2619ae9f252fc1440d144559b14a597dee8 Azure: 
[PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5572)
 
   
   
   Bot commands
 The @flinkbot bot supports the following commands:
   
- `@flinkbot run travis` re-run the last Travis build
- `@flinkbot run azure` re-run the last Azure build
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[jira] [Commented] (FLINK-18695) Allow NettyBufferPool to allocate heap buffers

2020-08-16 Thread Chesnay Schepler (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-18695?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17178632#comment-17178632
 ] 

Chesnay Schepler commented on FLINK-18695:
--

[~gaoyunhaii] Any insights so far?

> Allow NettyBufferPool to allocate heap buffers
> --
>
> Key: FLINK-18695
> URL: https://issues.apache.org/jira/browse/FLINK-18695
> Project: Flink
>  Issue Type: Improvement
>  Components: Runtime / Network
>Reporter: Chesnay Schepler
>Assignee: Yun Gao
>Priority: Major
> Fix For: 1.12.0
>
>
> in 4.1.43 netty made a change to their SslHandler to always use heap buffers 
> for JDK SSLEngine implementations, to avoid an additional memory copy.
> However, our {{NettyBufferPool}} forbids heap buffer allocations.
> We will either have to allow heap buffer allocations, or create a custom 
> SslHandler implementation that does not use heap buffers (although this seems 
> ill-adviced?).
> /cc [~sewen] [~uce] [~NicoK] [~zjwang] [~pnowojski]



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (FLINK-16917) "TPC-DS end-to-end test (Blink planner)" gets stuck

2020-08-16 Thread Chesnay Schepler (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-16917?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17178631#comment-17178631
 ] 

Chesnay Schepler commented on FLINK-16917:
--

A fix for the orc issues has been merged to master for 1.12, along with an 
option to disable FLINK-16245 in case there are similar instances in the future:

6c130daaf59e77b343d1c947822ea0573738a204

fce82d7f56d5da3d3bf9ea6b66888d1350eb172f

> "TPC-DS end-to-end test (Blink planner)" gets stuck
> ---
>
> Key: FLINK-16917
> URL: https://issues.apache.org/jira/browse/FLINK-16917
> Project: Flink
>  Issue Type: Bug
>  Components: Runtime / Task, Tests
>Reporter: Robert Metzger
>Assignee: Arvid Heise
>Priority: Blocker
>  Labels: pull-request-available, test-stability
> Fix For: 1.11.0
>
> Attachments: Screenshot 2020-04-02 08.12.01.png, Screenshot 
> 2020-04-02 08.24.28.png, image-2020-04-02-09-32-52-979.png
>
>  Time Spent: 20m
>  Remaining Estimate: 0h
>
> The message you see from the CI system is
> {code}
> ##[error]The job running on agent Hosted Agent ran longer than the maximum 
> time of 240 minutes. For more information, see 
> https://go.microsoft.com/fwlink/?linkid=2077134
> {code}
> Example: 
> https://dev.azure.com/rmetzger/Flink/_build/results?buildId=6899=logs=c88eea3b-64a0-564d-0031-9fdcd7b8abee
> The end of the log file looks as follows:
> {code}
> 2020-03-31T23:00:40.5416207Z [INFO]Run TPC-DS query 97 success.
> 2020-03-31T23:00:40.5439265Z [INFO]Run TPC-DS query 98 ...
> 2020-03-31T23:00:40.8269500Z Job has been submitted with JobID 
> eec4759ae6d585ee9f8d9f84f1793c0e
> 2020-03-31T23:01:33.4757621Z Program execution finished
> 2020-03-31T23:01:33.4758328Z Job with JobID eec4759ae6d585ee9f8d9f84f1793c0e 
> has finished.
> 2020-03-31T23:01:33.4758880Z Job Runtime: 51093 ms
> 2020-03-31T23:01:33.4759057Z 
> 2020-03-31T23:01:33.4760999Z [INFO]Run TPC-DS query 98 success.
> 2020-03-31T23:01:33.4761612Z [INFO]Run TPC-DS query 99 ...
> 2020-03-31T23:01:33.7297686Z Job has been submitted with JobID 
> f47efc4194df2e0ead677fff239f3dfd
> 2020-03-31T23:01:50.0037484Z ##[error]The operation was canceled.
> 2020-03-31T23:01:50.0091655Z ##[section]Finishing: Run e2e tests
> {code}
> Notice the time difference between "Job has been submitted" and "The 
> operation was canceled.". There was nothing happening for 20 minutes.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Closed] (FLINK-16245) Use a delegating classloader as the user code classloader to prevent class leaks.

2020-08-16 Thread Chesnay Schepler (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-16245?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Chesnay Schepler closed FLINK-16245.

Resolution: Fixed

master:

eed1b58cd4e04f536c816e16bc82f2beed4a2862

30e82697a3f25a638462d1af9732b6c35132def2

e8cfc750300134bd1bc637fe6e0690c69cf4ecf6

> Use a delegating classloader as the user code classloader to prevent class 
> leaks.
> -
>
> Key: FLINK-16245
> URL: https://issues.apache.org/jira/browse/FLINK-16245
> Project: Flink
>  Issue Type: Improvement
>  Components: Runtime / Task
>Reporter: Stephan Ewen
>Assignee: Arvid Heise
>Priority: Critical
>  Labels: pull-request-available, usability
> Fix For: 1.12.0
>
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> As reported in FLINK-11205, a reference to the user-code ClassLoader can be 
> held by some libraries, causing class leaks.
> One way to circumvent this class leak is if the ClassLoader that we set as 
> the user-code ClassLoader is a delegating ClassLoader to the real class 
> loader, and when closing the user code ClassLoader we null out the reference.
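
A minimal sketch of that idea (hypothetical class, not the code that was merged):

{code}
// Hypothetical delegating class loader that drops its delegate on close(),
// so libraries that cached this loader no longer pin the real user-code loader.
public class CloseableDelegatingClassLoader extends ClassLoader implements AutoCloseable {

    private volatile ClassLoader delegate;

    public CloseableDelegatingClassLoader(ClassLoader delegate) {
        super(null); // no parent: all non-bootstrap lookups go through the delegate
        this.delegate = delegate;
    }

    @Override
    protected Class<?> findClass(String name) throws ClassNotFoundException {
        ClassLoader d = delegate;
        if (d == null) {
            throw new ClassNotFoundException("User code class loader was already closed: " + name);
        }
        return d.loadClass(name);
    }

    @Override
    public void close() {
        // Null out the reference; the real user-code class loader becomes collectible
        // even if this wrapper is still referenced by a third-party cache.
        delegate = null;
    }

    // A complete version would also delegate getResource/getResources/getResourceAsStream.
}
{code}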



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [flink] zentol merged pull request #13027: [FLINK-16245] Decoupling user classloader from context classloader.

2020-08-16 Thread GitBox


zentol merged pull request #13027:
URL: https://github.com/apache/flink/pull/13027


   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [flink] flinkbot commented on pull request #13165: [FLINK-18963][docs] Introduced IntelliJ subsection about adding a Copyright Profile for the Apache license.

2020-08-16 Thread GitBox


flinkbot commented on pull request #13165:
URL: https://github.com/apache/flink/pull/13165#issuecomment-674578819


   Thanks a lot for your contribution to the Apache Flink project. I'm the 
@flinkbot. I help the community
   to review your pull request. We will use this comment to track the progress 
of the review.
   
   
   ## Automated Checks
   Last check on commit 0c04d957cb9a575618de52ddc94c290b2431e92a (Sun Aug 16 
21:14:30 UTC 2020)
   
✅ no warnings
   
   Mention the bot in a comment to re-run the automated checks.
   ## Review Progress
   
   * ❓ 1. The [description] looks good.
   * ❓ 2. There is [consensus] that the contribution should go into to Flink.
   * ❓ 3. Needs [attention] from.
   * ❓ 4. The change fits into the overall [architecture].
   * ❓ 5. Overall code [quality] is good.
   
   Please see the [Pull Request Review 
Guide](https://flink.apache.org/contributing/reviewing-prs.html) for a full 
explanation of the review process.
The Bot is tracking the review progress through labels. Labels are applied
according to the order of the review items. For consensus, approval by a Flink
committer or PMC member is required.
   Bot commands
 The @flinkbot bot supports the following commands:
   
- `@flinkbot approve description` to approve one or more aspects (aspects: 
`description`, `consensus`, `architecture` and `quality`)
- `@flinkbot approve all` to approve all aspects
- `@flinkbot approve-until architecture` to approve everything until 
`architecture`
- `@flinkbot attention @username1 [@username2 ..]` to require somebody's 
attention
- `@flinkbot disapprove architecture` to remove an approval you gave earlier
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [flink] XComp opened a new pull request #13165: [FLINK-18963][docs] Introduced IntelliJ subsection about adding a Copyright Profile for the Apache license.

2020-08-16 Thread GitBox


XComp opened a new pull request #13165:
URL: https://github.com/apache/flink/pull/13165


   ## What is the purpose of the change
   
   Instructions were added on how to add the Copyright Profile for the Apache
license to IntelliJ.
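
   For reference, the text such a Copyright Profile inserts is the standard ASF source header, which appears at the top of Flink's Java files as a block comment like the following (shown here only as an example of the expected result):

```java
/*
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements.  See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership.  The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License.  You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
```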
   
   ## Brief change log
   
   - Added a subsection to `docs/flinkDev/ide_setup.md` and 
`docs/flinkDev/idea_setup.zh.md`
   
   ## Verifying this change
   
   This change is a trivial rework / code cleanup without any test coverage.
   
   ## Does this pull request potentially affect one of the following parts:
   
 - Dependencies (does it add or upgrade a dependency): no
 - The public API, i.e., is any changed class annotated with 
`@Public(Evolving)`: no
 - The serializers: no
 - The runtime per-record code paths (performance sensitive): no
 - Anything that affects deployment or recovery: JobManager (and its 
components), Checkpointing, Kubernetes/Yarn/Mesos, ZooKeeper: no
 - The S3 file system connector: no
   
   ## Documentation
   
 - Does this pull request introduce a new feature? no
 - If yes, how is the feature documented? no
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[jira] [Updated] (FLINK-18963) Added Copyright information to coding style guide

2020-08-16 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-18963?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

ASF GitHub Bot updated FLINK-18963:
---
Labels: pull-request-available  (was: )

> Added Copyright information to coding style guide
> -
>
> Key: FLINK-18963
> URL: https://issues.apache.org/jira/browse/FLINK-18963
> Project: Flink
>  Issue Type: Improvement
>  Components: Project Website
>Reporter: Matthias
>Assignee: Matthias
>Priority: Minor
>  Labels: pull-request-available
>
> Add Copyright as a requirement to 
> [https://flink.apache.org/contributing/code-style-and-quality-common.html]
> Add Copyright profile instructions to ide_setup.md (including the Chinese 
> version).



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [flink-web] XComp opened a new pull request #368: [FLINK-18963][site] Added Copyright information.

2020-08-16 Thread GitBox


XComp opened a new pull request #368:
URL: https://github.com/apache/flink-web/pull/368


   It was not yet stated anywhere that the copyright information is mandatory.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [flink] flinkbot edited a comment on pull request #13163: [FLINK-16789][runtime] Support JMX RMI random port assign

2020-08-16 Thread GitBox


flinkbot edited a comment on pull request #13163:
URL: https://github.com/apache/flink/pull/13163#issuecomment-674551539


   
   ## CI report:
   
   * b073c46b934a744fa6374759e8784b7ea27b4bce Azure: 
[SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5566)
 
   
   
   Bot commands
 The @flinkbot bot supports the following commands:
   
- `@flinkbot run travis` re-run the last Travis build
- `@flinkbot run azure` re-run the last Azure build
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [flink] flinkbot edited a comment on pull request #13115: [FLINK-18219][runtime] Added OOM-enrichment for main method call in PackagedProgram.

2020-08-16 Thread GitBox


flinkbot edited a comment on pull request #13115:
URL: https://github.com/apache/flink/pull/13115#issuecomment-671776288


   
   ## CI report:
   
   * c7252f46c5704414a68f0ec9e1659da345afbd6c Azure: 
[FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5569)
 
   * 3695a2619ae9f252fc1440d144559b14a597dee8 UNKNOWN
   
   
   Bot commands
 The @flinkbot bot supports the following commands:
   
- `@flinkbot run travis` re-run the last Travis build
- `@flinkbot run azure` re-run the last Azure build
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [flink] flinkbot edited a comment on pull request #13115: [FLINK-18219][runtime] Added OOM-enrichment for main method call in PackagedProgram.

2020-08-16 Thread GitBox


flinkbot edited a comment on pull request #13115:
URL: https://github.com/apache/flink/pull/13115#issuecomment-671776288


   
   ## CI report:
   
   * 3a70e9874a3c3a84879251fb6402b8a329f3edf9 Azure: 
[FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5532)
 
   * c7252f46c5704414a68f0ec9e1659da345afbd6c UNKNOWN
   
   
   Bot commands
 The @flinkbot bot supports the following commands:
   
- `@flinkbot run travis` re-run the last Travis build
- `@flinkbot run azure` re-run the last Azure build
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [flink] flolas edited a comment on pull request #13128: [FLINK-18795][hbase] Support for HBase 2

2020-08-16 Thread GitBox


flolas edited a comment on pull request #13128:
URL: https://github.com/apache/flink/pull/13128#issuecomment-674567966


   This doesn't work when using YARN and Kerberos... Please see changes in 
https://github.com/apache/flink/pull/13047
   HBase changed its API for token delegation as of version 2.
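
   For context, a rough sketch of the difference (signatures quoted from memory; please verify against the HBase javadocs):

```java
// Rough sketch only: HBase 2 obtains delegation tokens from an open Connection
// instead of a bare Configuration (method names from memory, verify before use).
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.security.token.AuthenticationTokenIdentifier;
import org.apache.hadoop.hbase.security.token.TokenUtil;
import org.apache.hadoop.security.UserGroupInformation;
import org.apache.hadoop.security.token.Token;

public class ObtainHBaseDelegationToken {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();

        // HBase 1.x style (Configuration-based), which the old connector code relied on:
        // Token<AuthenticationTokenIdentifier> token = TokenUtil.obtainToken(conf);

        // HBase 2.x style: the token is requested through an established Connection.
        try (Connection connection = ConnectionFactory.createConnection(conf)) {
            Token<AuthenticationTokenIdentifier> token = TokenUtil.obtainToken(connection);
            UserGroupInformation.getCurrentUser().addToken(token);
        }
    }
}
```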



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [flink] flolas commented on pull request #13128: [FLINK-18795][hbase] Support for HBase 2

2020-08-16 Thread GitBox


flolas commented on pull request #13128:
URL: https://github.com/apache/flink/pull/13128#issuecomment-674567966


   This doesn't work when using YARN and Kerberos... Please see 
https://github.com/apache/flink/pull/13047
   HBase changed its API for token delegation as of version 2.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



