[GitHub] [flink] pnowojski commented on a change in pull request #11071: [FLINK-16019][runtime] fix ContinuousFileReaderOperator error handling
pnowojski commented on a change in pull request #11071: [FLINK-16019][runtime] fix ContinuousFileReaderOperator error handling
URL: https://github.com/apache/flink/pull/11071#discussion_r378697278

## File path: flink-streaming-java/src/main/java/org/apache/flink/streaming/api/functions/source/ContinuousFileReaderOperator.java

## @@ -449,7 +450,7 @@ private void cleanUp() {
 try { r.run(); } catch (Exception e) {
- firstException = ExceptionUtils.firstOrSuppressed(firstException, e);
+ firstException = ExceptionUtils.firstOrSuppressed(e, firstException);

Review comment:
> the only difference will be in error message and stacktrace.

Isn't that the whole point of using `firstOrSuppressed` - to have the correct exception message?

This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
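The argument order discussed in this review matters because the first argument becomes the exception that is suppressed when an earlier one already exists. The sketch below re-implements the presumed semantics of the helper locally (it does not import Flink's `ExceptionUtils`; the class name and body are assumptions inferred from the diff, where the call is `firstOrSuppressed(newException, previous)`):

```java
// Minimal sketch of the firstOrSuppressed pattern from the diff above.
// Assumption: firstOrSuppressed(newException, previous) keeps the FIRST
// failure as the primary exception and attaches later ones as suppressed.
public class FirstOrSuppressedSketch {

    public static <T extends Throwable> T firstOrSuppressed(T newException, T previous) {
        if (previous == null) {
            return newException; // the first failure becomes the primary exception
        }
        previous.addSuppressed(newException); // later failures are attached to it
        return previous;
    }

    public static void main(String[] args) {
        Exception first = null;
        // Simulate a cleanup loop where several cleanup tasks fail in order.
        for (String msg : new String[] {"close reader", "close splits"}) {
            try {
                throw new RuntimeException(msg);
            } catch (RuntimeException e) {
                // Correct order: new exception first, accumulator second.
                // Reversing the arguments (the bug being fixed) would make the
                // LAST failure primary and hide the root cause.
                first = firstOrSuppressed(e, first);
            }
        }
        System.out.println(first.getMessage());           // close reader
        System.out.println(first.getSuppressed().length); // 1
    }
}
```

With the reversed (pre-fix) argument order, the later exception would replace the earlier one as the primary, which is exactly the stack-trace/message difference the reviewer calls out.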
[GitHub] [flink] pnowojski commented on a change in pull request #11071: [FLINK-16019][runtime] fix ContinuousFileReaderOperator error handling
pnowojski commented on a change in pull request #11071: [FLINK-16019][runtime] fix ContinuousFileReaderOperator error handling
URL: https://github.com/apache/flink/pull/11071#discussion_r378696814

## File path: flink-streaming-java/src/main/java/org/apache/flink/streaming/api/functions/source/ContinuousFileReaderOperator.java

## @@ -380,10 +380,11 @@ public void processWatermark(Watermark mark) throws Exception {
 public void dispose() throws Exception {
 Exception e = null;
 if (state != ReaderState.CLOSED) {
+ state = ReaderState.CLOSED;

Review comment:
If it's not worth testing, then why do we need this code?
[jira] [Commented] (FLINK-16025) Service could expose different blob server port mismatched with JM Container
[ https://issues.apache.org/jira/browse/FLINK-16025?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17035993#comment-17035993 ]

Yang Wang commented on FLINK-16025:
---

Nice catch. I think it is a bug. We should use {{KubernetesUtils.parsePort(flinkConfig, BlobServerOptions.PORT)}} instead of {{Constants.BLOB_SERVER_PORT}} in ServiceDecorator.

> Service could expose different blob server port mismatched with JM Container
>
> Key: FLINK-16025
> URL: https://issues.apache.org/jira/browse/FLINK-16025
> Project: Flink
> Issue Type: Bug
> Components: Deployment / Kubernetes
> Affects Versions: 1.10.0
> Reporter: Canbin Zheng
> Priority: Critical
> Fix For: 1.10.1, 1.11.0
>
> The Service would always expose 6124 port if it should expose that port, and while building ServicePort we do not explicitly specify a target port, so the target port would always be 6124 too.
> {code:java}
> // From ServiceDecorator.java
> servicePorts.add(getServicePort(
>     getPortName(BlobServerOptions.PORT.key()),
>     Constants.BLOB_SERVER_PORT));
>
> private ServicePort getServicePort(String name, int port) {
>     return new ServicePortBuilder()
>         .withName(name)
>         .withPort(port)
>         .build();
> }
> {code}
> Meanwhile, the Container of the JM would expose the blob server port which is configured in the Flink Configuration:
> {code:java}
> // From FlinkMasterDeploymentDecorator.java
> final int blobServerPort = KubernetesUtils.parsePort(flinkConfig, BlobServerOptions.PORT);
> ...
> final Container container = createJobManagerContainer(flinkConfig, mainClass, hasLogback, hasLog4j, blobServerPort);
> {code}
> So there is a risk that in non-HA mode the TM could not execute a Task due to a dependency-fetching failure, if the Service exposes a blob server port that differs from the JM Container's, i.e. when one configures the blob server port with a value different from 6124.

-- This message was sent by Atlassian Jira (v8.3.4#803005)
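The mismatch described in this issue can be modeled in a few lines. The sketch below is a simplified local stand-in, not the fabric8 `ServicePortBuilder` API: the class and method names are illustrative assumptions, and the configuration lookup mimics what `KubernetesUtils.parsePort(flinkConfig, BlobServerOptions.PORT)` is described as doing, using the `blob.server.port` key:

```java
import java.util.HashMap;
import java.util.Map;

// Simplified model of the bug: the Service advertises a hard-coded port
// (6124) while the JM container exposes the port resolved from the Flink
// configuration. Names here are illustrative, not the real decorator API.
public class BlobPortMismatchSketch {

    public static final int DEFAULT_BLOB_PORT = 6124;

    /** Buggy behavior: the Service target port is always the constant 6124. */
    public static int buggyTargetPort() {
        return DEFAULT_BLOB_PORT;
    }

    /** Fixed behavior: resolve the port from the configuration, falling back
     *  to the default, mirroring the suggested KubernetesUtils.parsePort call. */
    public static int fixedTargetPort(Map<String, String> flinkConfig) {
        String v = flinkConfig.get("blob.server.port");
        return v == null ? DEFAULT_BLOB_PORT : Integer.parseInt(v);
    }

    public static void main(String[] args) {
        Map<String, String> conf = new HashMap<>();
        conf.put("blob.server.port", "6200"); // user overrides the blob port

        int containerPort = fixedTargetPort(conf); // what the JM container exposes
        System.out.println(buggyTargetPort() == containerPort); // false: the mismatch
        System.out.println(fixedTargetPort(conf) == containerPort); // true after the fix
    }
}
```

When no override is configured, both paths agree on 6124, which is why the bug only surfaces with a non-default blob server port.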
[GitHub] [flink] flinkbot commented on issue #11079: [FLINK-16031][python] Improve the description in the README file of PyFlink 1.9.x.
flinkbot commented on issue #11079: [FLINK-16031][python] Improve the description in the README file of PyFlink 1.9.x.
URL: https://github.com/apache/flink/pull/11079#issuecomment-585590555

## CI report:

* 2d3b4224a1c039ce6278c580d7bdee4b0f0a7825 UNKNOWN

Bot commands
The @flinkbot bot supports the following commands:
- `@flinkbot run travis` re-run the last Travis build
- `@flinkbot run azure` re-run the last Azure build
[GitHub] [flink] flinkbot edited a comment on issue #11059: [FLINK-15970][python] Optimize the Python UDF execution to only serialize the value
flinkbot edited a comment on issue #11059: [FLINK-15970][python] Optimize the Python UDF execution to only serialize the value
URL: https://github.com/apache/flink/pull/11059#issuecomment-584620649

## CI report:

* 9cd9bb39ff403454de2681ddb77fc5168c0df0cc Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148370358) Azure: [SUCCESS](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5053)
* 66c8cda07553d5114d3b1450e8804ac927630c3d Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/148726426) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5122)
* 800be54a086522912bb4fb9ca00bfc9ab6c935bf UNKNOWN
[GitHub] [flink] flinkbot edited a comment on issue #11044: [FLINK-15913][python] Add Python TableFunction Runner and Operator in Blink planner
flinkbot edited a comment on issue #11044: [FLINK-15913][python] Add Python TableFunction Runner and Operator in Blink planner
URL: https://github.com/apache/flink/pull/11044#issuecomment-583941299

## CI report:

* 507d7bb6076e4bc176c46f52e823762d264f3c28 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148130015) Azure: [SUCCESS](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=4985)
* e0be5057cfc067796c51c4917f5ca2c824ee7872 Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/148726376) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5120)
* d35fcc7bee22f220f89314db15e1eda13f1bba02 UNKNOWN
[jira] [Closed] (FLINK-15978) Publish Dockerfiles for release 1.10.0
[ https://issues.apache.org/jira/browse/FLINK-15978?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yu Li closed FLINK-15978.
---
Fix Version/s: (was: 1.10.1) 1.10.0
Resolution: Done

PR for official-images also merged. Closing since all work is done.

> Publish Dockerfiles for release 1.10.0
>
> Key: FLINK-15978
> URL: https://issues.apache.org/jira/browse/FLINK-15978
> Project: Flink
> Issue Type: Task
> Components: Release System
> Affects Versions: 1.10.0
> Reporter: Yu Li
> Assignee: Yu Li
> Priority: Blocker
> Labels: pull-request-available
> Fix For: 1.10.0
>
> Time Spent: 10m
> Remaining Estimate: 0h
>
> Publish the Dockerfiles for 1.10.0 after the RC voting passed, to finalize the release process as [documented|https://cwiki.apache.org/confluence/display/FLINK/Creating+a+Flink+Release].
[jira] [Commented] (FLINK-14881) Upgrade AWS SDK to support "IAM Roles for Service Accounts" in AWS EKS
[ https://issues.apache.org/jira/browse/FLINK-14881?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17035982#comment-17035982 ]

Rafi Aroch commented on FLINK-14881:
---

Flink currently uses the Java SDK (Version 1), so according to the documentation, the minimum version is 1.11.625.

> Upgrade AWS SDK to support "IAM Roles for Service Accounts" in AWS EKS
>
> Key: FLINK-14881
> URL: https://issues.apache.org/jira/browse/FLINK-14881
> Project: Flink
> Issue Type: Improvement
> Components: FileSystems
> Reporter: Vincent Chenal
> Priority: Major
>
> In order to use IAM Roles for Service Accounts in AWS EKS, the minimum required version of the AWS SDK is 1.11.623.
> [https://docs.aws.amazon.com/eks/latest/userguide/iam-roles-for-service-accounts-minimum-sdk.html]
[GitHub] [flink] flinkbot commented on issue #11079: [FLINK-16031][python] Improve the description in the README file of PyFlink 1.9.x.
flinkbot commented on issue #11079: [FLINK-16031][python] Improve the description in the README file of PyFlink 1.9.x.
URL: https://github.com/apache/flink/pull/11079#issuecomment-585585168

Thanks a lot for your contribution to the Apache Flink project. I'm the @flinkbot. I help the community to review your pull request. We will use this comment to track the progress of the review.

## Automated Checks

Last check on commit 2d3b4224a1c039ce6278c580d7bdee4b0f0a7825 (Thu Feb 13 07:12:47 UTC 2020)

**Warnings:**
* Documentation files were touched, but no `.zh.md` files: Update Chinese documentation or file a Jira ticket.
* **This pull request references an unassigned [Jira ticket](https://issues.apache.org/jira/browse/FLINK-16031).** According to the [code contribution guide](https://flink.apache.org/contributing/contribute-code.html), tickets need to be assigned before starting with the implementation work.

Mention the bot in a comment to re-run the automated checks.

## Review Progress

* ❓ 1. The [description] looks good.
* ❓ 2. There is [consensus] that the contribution should go into Flink.
* ❓ 3. Needs [attention] from.
* ❓ 4. The change fits into the overall [architecture].
* ❓ 5. Overall code [quality] is good.

Please see the [Pull Request Review Guide](https://flink.apache.org/contributing/reviewing-prs.html) for a full explanation of the review process. The Bot is tracking the review progress through labels. Labels are applied according to the order of the review items. For consensus, approval by a Flink committer or PMC member is required.

Bot commands
The @flinkbot bot supports the following commands:
- `@flinkbot approve description` to approve one or more aspects (aspects: `description`, `consensus`, `architecture` and `quality`)
- `@flinkbot approve all` to approve all aspects
- `@flinkbot approve-until architecture` to approve everything until `architecture`
- `@flinkbot attention @username1 [@username2 ..]` to require somebody's attention
- `@flinkbot disapprove architecture` to remove an approval you gave earlier
[GitHub] [flink] WeiZhong94 opened a new pull request #11079: [FLINK-16031][python] Improve the description in the README file of PyFlink 1.9.x.
WeiZhong94 opened a new pull request #11079: [FLINK-16031][python] Improve the description in the README file of PyFlink 1.9.x.
URL: https://github.com/apache/flink/pull/11079

## What is the purpose of the change

*This pull request improves the description in the README file of PyFlink 1.9.x.*

## Brief change log

- *Rewrite the content of `README.md`.*

## Verifying this change

This change is a trivial rework / code cleanup without any test coverage.

## Does this pull request potentially affect one of the following parts:

- Dependencies (does it add or upgrade a dependency): (no)
- The public API, i.e., is any changed class annotated with `@Public(Evolving)`: (no)
- The serializers: (no)
- The runtime per-record code paths (performance sensitive): (no)
- Anything that affects deployment or recovery: JobManager (and its components), Checkpointing, Yarn/Mesos, ZooKeeper: (no)
- The S3 file system connector: (no)

## Documentation

- Does this pull request introduce a new feature? (no)
- If yes, how is the feature documented? (not applicable)
[jira] [Updated] (FLINK-16031) Improve the description in the README file of PyFlink 1.9.x
[ https://issues.apache.org/jira/browse/FLINK-16031?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

ASF GitHub Bot updated FLINK-16031:
---
Labels: pull-request-available (was: )

> Improve the description in the README file of PyFlink 1.9.x
>
> Key: FLINK-16031
> URL: https://issues.apache.org/jira/browse/FLINK-16031
> Project: Flink
> Issue Type: Improvement
> Components: API / Python
> Affects Versions: 1.9.1
> Reporter: Wei Zhong
> Priority: Major
> Labels: pull-request-available
> Fix For: 1.9.2, 1.9.3
>
> Currently, the description in the README file of PyFlink 1.9.x is not suitable for publishing in PyPI. It should be changed to be more user-friendly.
[jira] [Updated] (FLINK-15908) Add description of support 'pip install' to 1.9.x documents
[ https://issues.apache.org/jira/browse/FLINK-15908?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hequn Cheng updated FLINK-15908:
---
Issue Type: Improvement (was: Bug)

> Add description of support 'pip install' to 1.9.x documents
>
> Key: FLINK-15908
> URL: https://issues.apache.org/jira/browse/FLINK-15908
> Project: Flink
> Issue Type: Improvement
> Components: Documentation
> Reporter: sunjincheng
> Assignee: Wei Zhong
> Priority: Major
> Labels: pull-request-available
> Fix For: 1.9.3
>
> Time Spent: 20m
> Remaining Estimate: 0h
>
> Add Description of support 'pip install' to 1.9.x documents.
[jira] [Commented] (FLINK-15908) Add description of support 'pip install' to 1.9.x documents
[ https://issues.apache.org/jira/browse/FLINK-15908?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17035969#comment-17035969 ]

Hequn Cheng commented on FLINK-15908:
---

Resolved in 1.9.3 via e13deed508065a91de61fc95f889cfac4adc1416

> Add description of support 'pip install' to 1.9.x documents
>
> Key: FLINK-15908
> URL: https://issues.apache.org/jira/browse/FLINK-15908
> Project: Flink
> Issue Type: Bug
> Components: Documentation
> Reporter: sunjincheng
> Assignee: Wei Zhong
> Priority: Major
> Labels: pull-request-available
> Fix For: 1.9.3
>
> Time Spent: 20m
> Remaining Estimate: 0h
>
> Add Description of support 'pip install' to 1.9.x documents.
[GitHub] [flink] flinkbot edited a comment on issue #11078: [FLINK-16026][python] Limits the avro-python3 version in Flink
flinkbot edited a comment on issue #11078: [FLINK-16026][python] Limits the avro-python3 version in Flink
URL: https://github.com/apache/flink/pull/11078#issuecomment-585556257

## CI report:

* 389c6de14afd66e75ceff564cdeb2a5d70838c6d Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/14871)
[jira] [Closed] (FLINK-15908) Add description of support 'pip install' to 1.9.x documents
[ https://issues.apache.org/jira/browse/FLINK-15908?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hequn Cheng closed FLINK-15908.
---
Resolution: Resolved

> Add description of support 'pip install' to 1.9.x documents
>
> Key: FLINK-15908
> URL: https://issues.apache.org/jira/browse/FLINK-15908
> Project: Flink
> Issue Type: Bug
> Components: Documentation
> Reporter: sunjincheng
> Assignee: Wei Zhong
> Priority: Major
> Labels: pull-request-available
> Fix For: 1.9.3
>
> Time Spent: 20m
> Remaining Estimate: 0h
>
> Add Description of support 'pip install' to 1.9.x documents.
[jira] [Commented] (FLINK-16004) Exclude flink-rocksdb-state-memory-control-test jars from the dist
[ https://issues.apache.org/jira/browse/FLINK-16004?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17035967#comment-17035967 ]

Yu Li commented on FLINK-16004:
---

Thanks [~sewen] for the quick analysis and [~chesnay] for the quick action!

> Exclude flink-rocksdb-state-memory-control-test jars from the dist
>
> Key: FLINK-16004
> URL: https://issues.apache.org/jira/browse/FLINK-16004
> Project: Flink
> Issue Type: Task
> Components: Tests
> Affects Versions: 1.10.0
> Reporter: Yu Li
> Assignee: Chesnay Schepler
> Priority: Major
> Labels: pull-request-available
> Fix For: 1.10.1, 1.11.0
>
> Time Spent: 10m
> Remaining Estimate: 0h
>
> Currently {{flink-rocksdb-state-memory-control-test}} will be included in the dist as shown [here|https://repository.apache.org/content/repositories/orgapacheflink-1333/org/apache/flink/flink-rocksdb-state-memory-control-test/]. We should remove it.
[GitHub] [flink] flinkbot edited a comment on issue #11070: [FLINK-16013][core] Write and parse list config option correctly
flinkbot edited a comment on issue #11070: [FLINK-16013][core] Write and parse list config option correctly
URL: https://github.com/apache/flink/pull/11070#issuecomment-585205728

## CI report:

* 8bdc077ea0813363d120eed404a2f4cd93ef0f9a Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148576486) Azure: [SUCCESS](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5107)
* 991a08ce8916c79874682ae2d088d3e7ab3606a3 Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148719940)
* 99dad8370f26efe83241865b2f3ec4771fc1b436 Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148722206)
* 1fc7132f5b81cc353ce497c2fc6bc1e695abba25 Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/148726434) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5123)
[jira] [Created] (FLINK-16031) Improve the description in the README file of PyFlink 1.9.x
Wei Zhong created FLINK-16031:
---

Summary: Improve the description in the README file of PyFlink 1.9.x
Key: FLINK-16031
URL: https://issues.apache.org/jira/browse/FLINK-16031
Project: Flink
Issue Type: Improvement
Components: API / Python
Affects Versions: 1.9.1
Reporter: Wei Zhong
Fix For: 1.9.3, 1.9.2

Currently, the description in the README file of PyFlink 1.9.x is not suitable for publishing in PyPI. It should be changed to be more user-friendly.
[GitHub] [flink] flinkbot edited a comment on issue #11059: [FLINK-15970][python] Optimize the Python UDF execution to only serialize the value
flinkbot edited a comment on issue #11059: [FLINK-15970][python] Optimize the Python UDF execution to only serialize the value
URL: https://github.com/apache/flink/pull/11059#issuecomment-584620649

## CI report:

* 9cd9bb39ff403454de2681ddb77fc5168c0df0cc Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148370358) Azure: [SUCCESS](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5053)
* 66c8cda07553d5114d3b1450e8804ac927630c3d Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/148726426) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5122)
[GitHub] [flink] flinkbot edited a comment on issue #11051: [FLINK-15961][table-planner][table-planner-blink] Introduce Python Physical Correlate RelNodes which are containers for Python TableFunction
flinkbot edited a comment on issue #11051: [FLINK-15961][table-planner][table-planner-blink] Introduce Python Physical Correlate RelNodes which are containers for Python TableFunction
URL: https://github.com/apache/flink/pull/11051#issuecomment-584123392

## CI report:

* 68b071496379b090212eb961b2026578ee2c64e3 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148190648) Azure: [SUCCESS](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5010)
* 9c17e490eca0cba3ca56c8094042b7e97bb217a4 Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/148726395) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5121)
[GitHub] [flink] hequn8128 merged pull request #11076: [FLINK-15908][docs] Add description of support 'pip install' to 1.9.x documents.
hequn8128 merged pull request #11076: [FLINK-15908][docs] Add description of support 'pip install' to 1.9.x documents.
URL: https://github.com/apache/flink/pull/11076
[GitHub] [flink] flinkbot edited a comment on issue #11044: [FLINK-15913][python] Add Python TableFunction Runner and Operator in Blink planner
flinkbot edited a comment on issue #11044: [FLINK-15913][python] Add Python TableFunction Runner and Operator in Blink planner
URL: https://github.com/apache/flink/pull/11044#issuecomment-583941299

## CI report:

* 507d7bb6076e4bc176c46f52e823762d264f3c28 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148130015) Azure: [SUCCESS](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=4985)
* e0be5057cfc067796c51c4917f5ca2c824ee7872 Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/148726376) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5120)
[GitHub] [flink] hequn8128 commented on issue #11076: [FLINK-15908][docs] Add description of support 'pip install' to 1.9.x documents.
hequn8128 commented on issue #11076: [FLINK-15908][docs] Add description of support 'pip install' to 1.9.x documents.
URL: https://github.com/apache/flink/pull/11076#issuecomment-585580060

Merging...
[jira] [Created] (FLINK-16030) Add heartbeat between netty server and client to detect long connection alive
begginghard created FLINK-16030:
---

Summary: Add heartbeat between netty server and client to detect long connection alive
Key: FLINK-16030
URL: https://issues.apache.org/jira/browse/FLINK-16030
Project: Flink
Issue Type: Improvement
Components: Runtime / Network
Affects Versions: 1.10.0
Reporter: begginghard
Fix For: 1.10.1

Networks can fail in many ways, sometimes pretty subtle (e.g. a high ratio of packet loss). When the long-lived TCP connection between the netty client and server is lost, the server fails to send its response to the client and then shuts down the channel. At the same time, the netty client does not know that the connection has been disconnected, so it keeps waiting for up to two hours.

To detect whether the long-lived TCP connection between the netty client and server is alive, there are two options: TCP keepalive and an application-level heartbeat. The TCP keepalive interval is 2 hours by default, so when the connection dies, the netty client waits those 2 hours before it triggers an exception and enters failover recovery. For faster detection, netty provides IdleStateHandler, which uses a ping-pong mechanism: if the netty client sends n consecutive ping messages and receives no pong message, it triggers an exception.
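The proposed heartbeat can be reduced to a small counting rule: declare the connection dead after n unanswered pings. Netty's real `IdleStateHandler` fires idle events on the channel pipeline; the sketch below only models the counting logic locally, so the class name, method names, and the threshold are illustrative assumptions, not Flink or Netty API:

```java
// Local model of the ping/pong liveness check proposed in FLINK-16030:
// after maxMissedPongs consecutive pings without a pong, the connection
// is considered dead and failover recovery should be triggered.
public class HeartbeatSketch {

    private final int maxMissedPongs;
    private int missedPongs;

    public HeartbeatSketch(int maxMissedPongs) {
        this.maxMissedPongs = maxMissedPongs;
    }

    /** Called when the idle timer fires and another ping goes unanswered.
     *  Returns true once the connection should be declared dead. */
    public boolean onPingWithoutPong() {
        missedPongs++;
        return missedPongs >= maxMissedPongs;
    }

    /** Called when a pong arrives: the peer is alive, reset the counter. */
    public void onPong() {
        missedPongs = 0;
    }

    public static void main(String[] args) {
        HeartbeatSketch hb = new HeartbeatSketch(3);
        System.out.println(hb.onPingWithoutPong()); // false: 1 missed pong
        hb.onPong();                                // peer answered, counter resets
        System.out.println(hb.onPingWithoutPong()); // false
        System.out.println(hb.onPingWithoutPong()); // false
        System.out.println(hb.onPingWithoutPong()); // true: 3 unanswered pings
    }
}
```

In a real deployment this counter would be driven by Netty's idle events (e.g. a handler installed after `IdleStateHandler` in the pipeline) rather than by explicit method calls.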
[GitHub] [flink] hequn8128 commented on a change in pull request #11044: [FLINK-15913][python] Add Python TableFunction Runner and Operator in Blink planner
hequn8128 commented on a change in pull request #11044: [FLINK-15913][python] Add Python TableFunction Runner and Operator in Blink planner URL: https://github.com/apache/flink/pull/11044#discussion_r378672265 ## File path: flink-python/src/main/java/org/apache/flink/table/runtime/operators/python/BaseRowPythonTableFunctionOperator.java ## @@ -0,0 +1,171 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ + +package org.apache.flink.table.runtime.operators.python; + +import org.apache.flink.annotation.Internal; +import org.apache.flink.api.common.typeutils.TypeSerializer; +import org.apache.flink.configuration.Configuration; +import org.apache.flink.python.PythonFunctionRunner; +import org.apache.flink.python.env.PythonEnvironmentManager; +import org.apache.flink.table.api.TableConfig; +import org.apache.flink.table.dataformat.BaseRow; +import org.apache.flink.table.dataformat.BinaryRow; +import org.apache.flink.table.dataformat.GenericRow; +import org.apache.flink.table.dataformat.JoinedRow; +import org.apache.flink.table.functions.TableFunction; +import org.apache.flink.table.functions.python.PythonFunctionInfo; +import org.apache.flink.table.planner.codegen.CodeGeneratorContext; +import org.apache.flink.table.planner.codegen.ProjectionCodeGenerator; +import org.apache.flink.table.runtime.generated.GeneratedProjection; +import org.apache.flink.table.runtime.generated.Projection; +import org.apache.flink.table.runtime.runners.python.BaseRowPythonTableFunctionRunner; +import org.apache.flink.table.runtime.typeutils.BaseRowSerializer; +import org.apache.flink.table.runtime.typeutils.PythonTypeUtils; +import org.apache.flink.table.types.logical.RowType; + +import org.apache.beam.sdk.fn.data.FnDataReceiver; +import org.apache.calcite.rel.core.JoinRelType; + +import java.io.IOException; + +/** + * The Python {@link TableFunction} operator for the blink planner. + */ +@Internal +public class BaseRowPythonTableFunctionOperator + extends AbstractPythonTableFunctionOperator { + + + private static final long serialVersionUID = 1L; + + /** +* The collector used to collect records. +*/ + private transient StreamRecordBaseRowWrappingCollector baseRowWrapper; + + /** +* The JoinedRow reused holding the execution result. +*/ + private transient JoinedRow reuseJoinedRow; + + /** +* The Projection which projects the udtf input fields from the input row. 
+*/ + private transient Projection udtfInputProjection; + + /** +* The TypeSerializer for udtf execution results. +*/ + private transient TypeSerializer udtfOutputTypeSerializer; + + /** +* The type serializer for the forwarded fields. +*/ + private transient BaseRowSerializer forwardedInputSerializer; + + public BaseRowPythonTableFunctionOperator( + Configuration config, + PythonFunctionInfo tableFunction, + RowType inputType, + RowType outputType, + int[] udtfInputOffsets, + JoinRelType joinType) { + super(config, tableFunction, inputType, outputType, udtfInputOffsets, joinType); + } + + @Override + @SuppressWarnings("unchecked") + public void open() throws Exception { + super.open(); + baseRowWrapper = new StreamRecordBaseRowWrappingCollector(output); + reuseJoinedRow = new JoinedRow(); + + udtfInputProjection = createUdtfInputProjection(); + forwardedInputSerializer = new BaseRowSerializer(this.getExecutionConfig(), inputType); + udtfOutputTypeSerializer = PythonTypeUtils.toBlinkTypeSerializer(userDefinedFunctionOutputType); + } + + @Override + public void bufferInput(BaseRow input) { + // always copy the input BaseRow + BaseRow forwardedFields = forwardedInputSerializer.copy(input); + forwardedFields.setHeader(input.getHeader()); + forwardedInputQueue.add(input); Review comment: forwardedInputQueue.add(forwardedFields);
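The review point above (buffer `forwardedFields`, the defensive copy, rather than `input`) matters because the runtime may reuse the input row object for the next record. The sketch below demonstrates the failure mode with a simplified stand-in: a mutable `int[]` plays the role of Flink's reused `BaseRow`, and `copy` stands in for the serializer's copy, so all names here are illustrative assumptions:

```java
import java.util.ArrayDeque;

// Sketch of the defensive-copy bug flagged in the review: buffering the
// original (reused) row lets later mutations corrupt the buffered value,
// while buffering the copy preserves it.
public class DefensiveCopySketch {

    // Simplified stand-in for forwardedInputSerializer.copy(input).
    public static int[] copy(int[] row) {
        return row.clone();
    }

    public static void main(String[] args) {
        ArrayDeque<int[]> buggyQueue = new ArrayDeque<>();
        ArrayDeque<int[]> fixedQueue = new ArrayDeque<>();

        int[] reusedRow = {1};
        int[] forwardedFields = copy(reusedRow);

        buggyQueue.add(reusedRow);        // bug: buffers the mutable original
        fixedQueue.add(forwardedFields);  // fix: buffers the defensive copy

        reusedRow[0] = 2; // the runtime reuses the row object for the next record

        System.out.println(buggyQueue.peek()[0]); // 2: buffered value corrupted
        System.out.println(fixedQueue.peek()[0]); // 1: the copy is preserved
    }
}
```

This is exactly why the comment "always copy the input BaseRow" in the operator is only honored if the copy, not the original, ends up in `forwardedInputQueue`.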
[GitHub] [flink] hequn8128 commented on a change in pull request #11044: [FLINK-15913][python] Add Python TableFunction Runner and Operator in Blink planner
hequn8128 commented on a change in pull request #11044: [FLINK-15913][python] Add Python TableFunction Runner and Operator in Blink planner URL: https://github.com/apache/flink/pull/11044#discussion_r378674085

## File path: flink-python/src/main/java/org/apache/flink/table/runtime/operators/python/AbstractPythonTableFunctionOperator.java
## @@ -45,14 +47,21 @@
 	 */
 	protected final PythonFunctionInfo tableFunction;

+	/**
+	 * The correlate join type.
+	 */
+	protected final JoinRelType joinType;
+
 	public AbstractPythonTableFunctionOperator(
 		Configuration config,
 		PythonFunctionInfo tableFunction,
 		RowType inputType,
 		RowType outputType,
-		int[] udtfInputOffsets) {
+		int[] udtfInputOffsets,
+		JoinRelType joinType) {
 		super(config, inputType, outputType, udtfInputOffsets);
 		this.tableFunction = Preconditions.checkNotNull(tableFunction);
+		this.joinType = Preconditions.checkNotNull(joinType);

Review comment:
   Check inner and left.

This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
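The "check inner and left" suggestion above can be sketched as follows. This is a hypothetical, self-contained illustration, not the actual Flink code: the local `JoinRelType` enum stands in for `org.apache.calcite.rel.core.JoinRelType`, and the method name and error message are assumptions.

```java
// Self-contained sketch of the suggested validation: a Python table function
// correlate only supports INNER and LEFT joins, so reject other join types
// at construction time instead of failing later during execution.
public class JoinTypeValidation {

    // Stand-in for org.apache.calcite.rel.core.JoinRelType.
    enum JoinRelType { INNER, LEFT, RIGHT, FULL }

    // Returns the join type unchanged if supported, otherwise throws.
    static JoinRelType checkInnerOrLeft(JoinRelType joinType) {
        if (joinType != JoinRelType.INNER && joinType != JoinRelType.LEFT) {
            throw new IllegalArgumentException(
                "Unsupported join type " + joinType + ". Only INNER and LEFT are supported.");
        }
        return joinType;
    }

    public static void main(String[] args) {
        // Accepted join types pass through unchanged.
        System.out.println(checkInnerOrLeft(JoinRelType.INNER));
        System.out.println(checkInnerOrLeft(JoinRelType.LEFT));
    }
}
```

Performing the check in the constructor (next to the `Preconditions.checkNotNull` calls) surfaces an unsupported plan immediately at operator creation rather than at runtime.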
[GitHub] [flink] WeiZhong94 commented on issue #11076: [FLINK-15908][docs] Add description of support 'pip install' to 1.9.x documents.
WeiZhong94 commented on issue #11076: [FLINK-15908][docs] Add description of support 'pip install' to 1.9.x documents. URL: https://github.com/apache/flink/pull/11076#issuecomment-585576235

@hequn8128 Thanks for your review! I have updated this PR according to your comments.
[jira] [Closed] (FLINK-16026) Travis failed due to python setup
[ https://issues.apache.org/jira/browse/FLINK-16026?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dian Fu closed FLINK-16026. --- Resolution: Fixed Merged to master via c40b0b232f4b04102b8220ced99cc4ae638a65df release-1.10 via b59e348c4d5ae2c5e2817eca768635f9d3794278 > Travis failed due to python setup > - > > Key: FLINK-16026 > URL: https://issues.apache.org/jira/browse/FLINK-16026 > Project: Flink > Issue Type: Bug > Components: API / Python >Reporter: Jingsong Lee >Assignee: Huang Xingbo >Priority: Critical > Labels: pull-request-available > Fix For: 1.10.1, 1.11.0 > > Time Spent: 20m > Remaining Estimate: 0h > > [https://api.travis-ci.com/v3/job/286671652/log.txt] > [https://api.travis-ci.org/v3/job/649754603/log.txt] > [https://api.travis-ci.com/v3/job/286409130/log.txt] > Collecting avro-python3<2.0.0,>=1.8.1; python_version >= "3.0" (from > apache-beam==2.19.0->apache-flink==1.11.dev0) Using cached > https://files.pythonhosted.org/packages/31/21/d98e2515e5ca0337d7e747e8065227ee77faf5c817bbb74391899613178a/avro-python3-1.9.2.tar.gz > Complete output from command python setup.py egg_info: Traceback (most > recent call last): File "", line 1, in File > "/tmp/pip-install-d6uvsl_b/avro-python3/setup.py", line 41, in > import pycodestyle ModuleNotFoundError: No module named 'pycodestyle' > Command "python setup.py egg_info" > failed with error code 1 in /tmp/pip-install-d6uvsl_b/avro-python3/ You are > using pip version 10.0.1, however version 20.0.2 is available. You should > consider upgrading via the 'pip install --upgrade pip' command. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Commented] (FLINK-15948) Resource will be wasted when the task manager memory is not a multiple of Yarn minimum allocation
[ https://issues.apache.org/jira/browse/FLINK-15948?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17035950#comment-17035950 ]

Yang Wang commented on FLINK-15948:
---
[~xintongsong] Thanks for your comments. Now I think it makes sense to add the exact division check in {{YarnClusterDescriptor#validateClusterResources}}. Currently, that is enough for our users. If they really want to see the allocated container resource, they can refer to the Yarn ResourceManager web UI or REST API.

> Resource will be wasted when the task manager memory is not a multiple of Yarn minimum allocation
> ---
>
> Key: FLINK-15948
> URL: https://issues.apache.org/jira/browse/FLINK-15948
> Project: Flink
> Issue Type: Bug
> Components: Deployment / YARN
> Affects Versions: 1.10.0
> Reporter: Yang Wang
> Priority: Major
>
> If the {{taskmanager.memory.process.size}} is set to 2000m and the Yarn minimum allocation is 128m, we will get a container with 2048m. Currently, {{TaskExecutorProcessSpec}} is built with 2000m, so we will have 48m wasted and they could not be used by Flink.
> I think Flink has accounted all the jvm heap, off-heap, overhead resources. So we should not leave these free memory there. And i suggest to update the {{TaskExecutorProcessSpec}} according to the Yarn allocated container.
[GitHub] [flink] flinkbot edited a comment on issue #11077: [FLINK-15909][build] Add PyPI release process into the subsequent release of 1.9.x.
flinkbot edited a comment on issue #11077: [FLINK-15909][build] Add PyPI release process into the subsequent release of 1.9.x. URL: https://github.com/apache/flink/pull/11077#issuecomment-585548400

## CI report:
* 4ac010892eab3b0a44397b03aff0562c1fe407df Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148719956)

Bot commands The @flinkbot bot supports the following commands:
- `@flinkbot run travis` re-run the last Travis build
- `@flinkbot run azure` re-run the last Azure build
[GitHub] [flink] dianfu closed pull request #11078: [FLINK-16026][python] Limits the avro-python3 version in Flink
dianfu closed pull request #11078: [FLINK-16026][python] Limits the avro-python3 version in Flink URL: https://github.com/apache/flink/pull/11078
[GitHub] [flink] flinkbot edited a comment on issue #11070: [FLINK-16013][core] Write and parse list config option correctly
flinkbot edited a comment on issue #11070: [FLINK-16013][core] Write and parse list config option correctly URL: https://github.com/apache/flink/pull/11070#issuecomment-585205728

## CI report:
* 8bdc077ea0813363d120eed404a2f4cd93ef0f9a Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148576486) Azure: [SUCCESS](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5107)
* 991a08ce8916c79874682ae2d088d3e7ab3606a3 Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148719940)
* 99dad8370f26efe83241865b2f3ec4771fc1b436 Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/148722206)
* 1fc7132f5b81cc353ce497c2fc6bc1e695abba25 UNKNOWN
[GitHub] [flink] flinkbot edited a comment on issue #11059: [FLINK-15970][python] Optimize the Python UDF execution to only serialize the value
flinkbot edited a comment on issue #11059: [FLINK-15970][python] Optimize the Python UDF execution to only serialize the value URL: https://github.com/apache/flink/pull/11059#issuecomment-584620649

## CI report:
* 9cd9bb39ff403454de2681ddb77fc5168c0df0cc Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148370358) Azure: [SUCCESS](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5053)
* 66c8cda07553d5114d3b1450e8804ac927630c3d UNKNOWN
[GitHub] [flink] flinkbot edited a comment on issue #11051: [FLINK-15961][table-planner][table-planner-blink] Introduce Python Physical Correlate RelNodes which are containers for Python TableFunction
flinkbot edited a comment on issue #11051: [FLINK-15961][table-planner][table-planner-blink] Introduce Python Physical Correlate RelNodes which are containers for Python TableFunction URL: https://github.com/apache/flink/pull/11051#issuecomment-584123392

## CI report:
* 68b071496379b090212eb961b2026578ee2c64e3 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148190648) Azure: [SUCCESS](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5010)
* 9c17e490eca0cba3ca56c8094042b7e97bb217a4 UNKNOWN
[GitHub] [flink] flinkbot edited a comment on issue #11044: [FLINK-15913][python] Add Python TableFunction Runner and Operator in Blink planner
flinkbot edited a comment on issue #11044: [FLINK-15913][python] Add Python TableFunction Runner and Operator in Blink planner URL: https://github.com/apache/flink/pull/11044#issuecomment-583941299

## CI report:
* 507d7bb6076e4bc176c46f52e823762d264f3c28 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148130015) Azure: [SUCCESS](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=4985)
* e0be5057cfc067796c51c4917f5ca2c824ee7872 UNKNOWN
[GitHub] [flink] hequn8128 commented on a change in pull request #11076: [FLINK-15908][docs] Add description of support 'pip install' to 1.9.x documents.
hequn8128 commented on a change in pull request #11076: [FLINK-15908][docs] Add description of support 'pip install' to 1.9.x documents. URL: https://github.com/apache/flink/pull/11076#discussion_r378668403

## File path: docs/tutorials/python_table_api.md
## @@ -31,10 +31,14 @@ to running a Python Table API program. ## Setting up a Python Project
-Firstly, you can fire up your favorite IDE and create a Python project and then
-you need to install the PyFlink package. Please
-see [Build PyFlink]({{ site.baseurl }}/flinkDev/building.html#build-pyflink)
-for more details about this.
+You can begin by creating a Python project and installing the PyFlink package.
+PyFlink is available via PyPi and can be easily installed using `pip`.

Review comment:
   Add a link to Pypi
[GitHub] [flink] hequn8128 commented on a change in pull request #11076: [FLINK-15908][docs] Add description of support 'pip install' to 1.9.x documents.
hequn8128 commented on a change in pull request #11076: [FLINK-15908][docs] Add description of support 'pip install' to 1.9.x documents. URL: https://github.com/apache/flink/pull/11076#discussion_r378664262

## File path: docs/tutorials/python_table_api.md
## @@ -31,10 +31,14 @@ to running a Python Table API program. ## Setting up a Python Project
-Firstly, you can fire up your favorite IDE and create a Python project and then
-you need to install the PyFlink package. Please
-see [Build PyFlink]({{ site.baseurl }}/flinkDev/building.html#build-pyflink)
-for more details about this.
+You can begin by creating a Python project and installing the PyFlink package.
+PyFlink is available via PyPi and can be easily installed using `pip`.
+
+{% highlight bash %}
+$ python -m pip install apache-flink

Review comment:
   For the version of 1.9 series, maybe it's better to specify the version, for example:
   ```
   # install the latest 1.9 version of PyFlink
   python -m pip install apache-flink==1.9.*
   ```
   The command will download the latest 1.9 pyflink. What do you think?
[GitHub] [flink] hequn8128 commented on a change in pull request #11076: [FLINK-15908][docs] Add description of support 'pip install' to 1.9.x documents.
hequn8128 commented on a change in pull request #11076: [FLINK-15908][docs] Add description of support 'pip install' to 1.9.x documents. URL: https://github.com/apache/flink/pull/11076#discussion_r378664746

## File path: docs/tutorials/python_table_api.zh.md
## @@ -30,8 +30,8 @@ under the License. ## 创建一个Python Table API项目
-首先，你可以使用你最熟悉的IDE，创建一个Python项目。然后，你需要安装PyFlink包，
-请参考[构建PyFlink]({{ site.baseurl }}/zh/flinkDev/building.html#build-pyflink)了解详细信息。
+首先，使用您最熟悉的IDE创建一个Python项目。之后执行命令`python -m pip install apache-flink`从PyPI下载安装PyFlink包。

Review comment:
   ditto
[jira] [Created] (FLINK-16029) Remove register source and sink in test cases of planner
Zhenghua Gao created FLINK-16029:

Summary: Remove register source and sink in test cases of planner
Key: FLINK-16029
URL: https://issues.apache.org/jira/browse/FLINK-16029
Project: Flink
Issue Type: Sub-task
Reporter: Zhenghua Gao

Many test cases of the planner use TableEnvironment.registerTableSource() and registerTableSink(), which should be avoided. We want to refactor these cases via TableEnvironment.connect().
[jira] [Commented] (FLINK-16023) jdbc connector's 'connector.table' property should be optional rather than required
[ https://issues.apache.org/jira/browse/FLINK-16023?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17035943#comment-17035943 ]

Jark Wu commented on FLINK-16023:
-
You can just set the mysql table's name to both the 'connector.table' and the registered name. Then what's the problem with the current design? [~phoenixjiangnan]

> jdbc connector's 'connector.table' property should be optional rather than required
> ---
>
> Key: FLINK-16023
> URL: https://issues.apache.org/jira/browse/FLINK-16023
> Project: Flink
> Issue Type: Improvement
> Components: Connectors / JDBC
> Reporter: Bowen Li
> Assignee: Jingsong Lee
> Priority: Major
> Fix For: 1.11.0
>
> jdbc connector's 'connector.table' property should be optional rather than required.
> connector should assume the table name in dbms is the same as that in Flink when this property is not present
> The fundamental reason is that such a design didn't consider integration with catalogs. Once introduced catalog, the flink table's name should be just the 'table''s name in corresponding external system.
> cc [~ykt836]
[jira] [Comment Edited] (FLINK-16013) List and map config options could not be parsed correctly
[ https://issues.apache.org/jira/browse/FLINK-16013?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17035904#comment-17035904 ] Yang Wang edited comment on FLINK-16013 at 2/13/20 6:00 AM: [~dwysakowicz] Thanks for your suggestion. It really makes sense to put the converting logic in {{convertToString}}. was (Author: fly_in_gis): [~dwysakowicz] Thanks for your suggestion. It really makes sense to put the converting {{List}} to string in {{convertToString}}. > List and map config options could not be parsed correctly > - > > Key: FLINK-16013 > URL: https://issues.apache.org/jira/browse/FLINK-16013 > Project: Flink > Issue Type: Bug > Components: Runtime / Configuration >Affects Versions: 1.10.0 >Reporter: Yang Wang >Assignee: Yang Wang >Priority: Major > Labels: pull-request-available > Fix For: 1.10.1, 1.11.0 > > Time Spent: 10m > Remaining Estimate: 0h > > Currently, if a config option is {{List}} type and written to a > flink-conf.yaml, it could not be parsed correctly when reloaded from yaml > resource. The root cause is we use {{List#toString}} to save into the yaml > resource. However, when we want to parse a List from a string, we use > semicolon to split the value. > > Also the Map, Duration type have the same problem. > > The following is a unit test to reproduce this problem. 
> {code:java}
> public void testWriteConfigurationAndReload() throws IOException {
> 	final File flinkConfDir = temporaryFolder.newFolder().getAbsoluteFile();
> 	final Configuration flinkConfig = new Configuration();
> 	final ConfigOption<List<String>> listConfigOption = ConfigOptions
> 		.key("test-list-string-key")
> 		.stringType()
> 		.asList()
> 		.noDefaultValue();
> 	final List<String> values = Arrays.asList("value1", "value2", "value3");
> 	flinkConfig.set(listConfigOption, values);
> 	assertThat(values, Matchers.containsInAnyOrder(flinkConfig.get(listConfigOption).toArray()));
> 	BootstrapTools.writeConfiguration(flinkConfig, new File(flinkConfDir, "flink-conf.yaml"));
> 	final Configuration loadedFlinkConfig = GlobalConfiguration.loadConfiguration(flinkConfDir.getAbsolutePath());
> 	assertThat(values, Matchers.containsInAnyOrder(loadedFlinkConfig.get(listConfigOption).toArray()));
> }
> {code}
[jira] [Updated] (FLINK-16013) List and map config options could not be parsed correctly
[ https://issues.apache.org/jira/browse/FLINK-16013?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Wang updated FLINK-16013: -- Description: Currently, if a config option is {{List}} type and written to a flink-conf.yaml, it could not be parsed correctly when reloaded from yaml resource. The root cause is we use {{List#toString}} to save into the yaml resource. However, when we want to parse a List from a string, we use semicolon to split the value. Also the Map, Duration type have the same problem. The following is a unit test to reproduce this problem. {code:java} public void testWriteConfigurationAndReload() throws IOException { final File flinkConfDir = temporaryFolder.newFolder().getAbsoluteFile(); final Configuration flinkConfig = new Configuration(); final ConfigOption> listConfigOption = ConfigOptions .key("test-list-string-key") .stringType() .asList() .noDefaultValue(); final List values = Arrays.asList("value1", "value2", "value3"); flinkConfig.set(listConfigOption, values); assertThat(values, Matchers.containsInAnyOrder(flinkConfig.get(listConfigOption).toArray())); BootstrapTools.writeConfiguration(flinkConfig, new File(flinkConfDir, "flink-conf.yaml")); final Configuration loadedFlinkConfig = GlobalConfiguration.loadConfiguration(flinkConfDir.getAbsolutePath()); assertThat(values, Matchers.containsInAnyOrder(loadedFlinkConfig.get(listConfigOption).toArray())); } {code} was: Currently, if a config option is {{List}} type and written to a flink-conf.yaml, it could not be parsed correctly when reloaded from yaml resource. The root cause is we use {{List#toString}} to save into the yaml resource. However, when we want to parse a List from a string, we use semicolon to split the value. The following is a unit test to reproduce this problem. 
{code:java}
public void testWriteConfigurationAndReload() throws IOException {
	final File flinkConfDir = temporaryFolder.newFolder().getAbsoluteFile();
	final Configuration flinkConfig = new Configuration();
	final ConfigOption<List<String>> listConfigOption = ConfigOptions
		.key("test-list-string-key")
		.stringType()
		.asList()
		.noDefaultValue();
	final List<String> values = Arrays.asList("value1", "value2", "value3");
	flinkConfig.set(listConfigOption, values);
	assertThat(values, Matchers.containsInAnyOrder(flinkConfig.get(listConfigOption).toArray()));
	BootstrapTools.writeConfiguration(flinkConfig, new File(flinkConfDir, "flink-conf.yaml"));
	final Configuration loadedFlinkConfig = GlobalConfiguration.loadConfiguration(flinkConfDir.getAbsolutePath());
	assertThat(values, Matchers.containsInAnyOrder(loadedFlinkConfig.get(listConfigOption).toArray()));
}
{code}

was:
Currently, if a config option is {{List}} type and written to a flink-conf.yaml, it could not be parsed correctly when reloaded from yaml resource. The root cause is we use {{List#toString}} to save into the yaml resource. However, when we want to parse a List from a string, we use semicolon to split the value.

The following is a unit test to reproduce this problem.

> List and map config options could not be parsed correctly
> ---
>
> Key: FLINK-16013
> URL: https://issues.apache.org/jira/browse/FLINK-16013
> Project: Flink
> Issue Type: Bug
> Components: Runtime / Configuration
> Affects Versions: 1.10.0
> Reporter: Yang Wang
> Assignee: Yang Wang
> Priority: Major
> Labels: pull-request-available
> Fix For: 1.10.1, 1.11.0
>
> Time Spent: 10m
> Remaining Estimate: 0h
>
> Currently, if a config option is {{List}} type and written to a flink-conf.yaml, it could not be parsed correctly when reloaded from yaml resource. The root cause is we use {{List#toString}} to save into the yaml resource. However, when we want to parse a List from a string, we use semicolon to split the value.
>
> Also the Map, Duration type have the same problem.
>
> The following is a unit test to reproduce this problem.
> {code:java}
> public void testWriteConfigurationAndReload() throws IOException {
> 	final File flinkConfDir = temporaryFolder.newFolder().getAbsoluteFile();
> 	final Configuration flinkConfig = new Configuration();
> 	final ConfigOption<List<String>> listConfigOption = ConfigOptions
> 		.key("test-list-string-key")
> 		.stringType()
> 		.asList()
> 		.noDefaultValue();
> 	final List<String> values = Arrays.asList("value1", "value2", "value3");
> 	flinkConfig.set(listConfigOption, values);
> 	assertThat(values, Matchers.containsInAnyOrder(flinkConfig.get(listConfigOption).toArray()));
> 	BootstrapTools.writeConfiguration(flinkConfig, new File(flinkConfDir, "flink-conf.yaml"));
> 	final Configuration loadedFlinkConfig = GlobalConfiguration.loadConfiguration(flinkConfDir.getAbsolutePath());
> 	assertThat(values, Matchers.containsInAnyOrder(loadedFlinkConfig.get(listConfigOption).toArray()));
> }
> {code}
[GitHub] [flink] wangyang0918 commented on issue #11070: [FLINK-16013][core] Write and parse list config option correctly
wangyang0918 commented on issue #11070: [FLINK-16013][core] Write and parse list config option correctly URL: https://github.com/apache/flink/pull/11070#issuecomment-585565086

@kl0u Thanks for your comments. Moving the converting logic to `Configuration#convertToString()` is a better choice. I have updated the PR accordingly. BTW, I think we cannot avoid `instanceof List`, because the value could be any `List` (e.g. `ArrayList`, `LinkedList`, etc.).
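The fix under discussion — serializing list-typed options with the same semicolon delimiter the parser expects, instead of relying on `List#toString()` — can be sketched in isolation. This is a simplified, hypothetical stand-in for the `Configuration#convertToString()` logic, not the actual Flink implementation:

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

// Sketch of why List#toString() breaks the round trip and how joining with
// the parser's delimiter fixes it: "[value1, value2, value3]" cannot be
// re-parsed by a semicolon splitter, but "value1;value2;value3" can.
public class ListOptionRoundTrip {

    // Serialization side: join list elements with the delimiter the parser uses.
    static String convertToString(Object value) {
        if (value instanceof List) {
            return ((List<?>) value).stream()
                .map(Object::toString)
                .collect(Collectors.joining(";"));
        }
        return value.toString();
    }

    // Parsing side: split the stored string on the same delimiter.
    static List<String> convertToList(String value) {
        return Arrays.asList(value.split(";"));
    }

    public static void main(String[] args) {
        List<String> values = Arrays.asList("value1", "value2", "value3");
        String serialized = convertToString(values);
        // The serialized form round-trips back to the original list.
        System.out.println(serialized);
        System.out.println(convertToList(serialized));
    }
}
```

The `instanceof List` check mirrors the point made in the comment above: the concrete list class is unknown at serialization time, so dispatching on the `List` interface is unavoidable.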
[GitHub] [flink] rkhachatryan commented on issue #11071: [FLINK-16019][runtime] fix ContinuousFileReaderOperator error handling
rkhachatryan commented on issue #11071: [FLINK-16019][runtime] fix ContinuousFileReaderOperator error handling URL: https://github.com/apache/flink/pull/11071#issuecomment-585564593

CI failure is unrelated: https://issues.apache.org/jira/browse/FLINK-16026
[GitHub] [flink] flinkbot edited a comment on issue #11078: [FLINK-16026][python] Limits the avro-python3 version in Flink
flinkbot edited a comment on issue #11078: [FLINK-16026][python] Limits the avro-python3 version in Flink URL: https://github.com/apache/flink/pull/11078#issuecomment-585556257

## CI report:
* 389c6de14afd66e75ceff564cdeb2a5d70838c6d Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/14871)
[GitHub] [flink] flinkbot edited a comment on issue #11069: [hotfix] Fix some typos in flink-examples-streaming
flinkbot edited a comment on issue #11069: [hotfix] Fix some typos in flink-examples-streaming URL: https://github.com/apache/flink/pull/11069#issuecomment-585181372

## CI report:
* 059e816c9f5973563a4f03e4c6af2e4af4729b66 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148567365) Azure: [SUCCESS](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5105)
* 3cf0448fa909468f6c2d5508fe527be436ef475f Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148715451)
* 578e1424988f95f022c3fbd558b54d5fe8ffe84c Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148717843)
[GitHub] [flink] flinkbot edited a comment on issue #11070: [FLINK-16013][core] Write and parse list config option correctly
flinkbot edited a comment on issue #11070: [FLINK-16013][core] Write and parse list config option correctly URL: https://github.com/apache/flink/pull/11070#issuecomment-585205728

## CI report:
* 8bdc077ea0813363d120eed404a2f4cd93ef0f9a Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148576486) Azure: [SUCCESS](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5107)
* 991a08ce8916c79874682ae2d088d3e7ab3606a3 Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148719940)
* 99dad8370f26efe83241865b2f3ec4771fc1b436 Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/148722206)
[GitHub] [flink] HuangXingBo commented on issue #11044: [FLINK-15913][python] Add Python TableFunction Runner and Operator in Blink planner
HuangXingBo commented on issue #11044: [FLINK-15913][python] Add Python TableFunction Runner and Operator in Blink planner URL: https://github.com/apache/flink/pull/11044#issuecomment-585560961

Thanks a lot for @hequn8128's review. I have addressed the comments in the latest commit.
[jira] [Comment Edited] (FLINK-16013) List and map config options could not be parsed correctly
[ https://issues.apache.org/jira/browse/FLINK-16013?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17035904#comment-17035904 ] Yang Wang edited comment on FLINK-16013 at 2/13/20 5:40 AM: [~dwysakowicz] Thanks for your suggestion. It really makes sense to put the converting {{List}} to string in {{convertToString}}. was (Author: fly_in_gis): [~dwysakowicz] Thanks for your suggestion. It really makes sense to put the converting {{List}} to string in {{convertToString}}. Also i find that not only the {{List}}, also the {{Map}} could not be parsed correctly. Other complex types {{Enum}}, {{Duration}}, {{MemorySize}} work well. > List and map config options could not be parsed correctly > - > > Key: FLINK-16013 > URL: https://issues.apache.org/jira/browse/FLINK-16013 > Project: Flink > Issue Type: Bug > Components: Runtime / Configuration >Affects Versions: 1.10.0 >Reporter: Yang Wang >Assignee: Yang Wang >Priority: Major > Labels: pull-request-available > Fix For: 1.10.1, 1.11.0 > > Time Spent: 10m > Remaining Estimate: 0h > > Currently, if a config option is {{List}} type and written to a > flink-conf.yaml, it could not be parsed correctly when reloaded from yaml > resource. The root cause is we use {{List#toString}} to save into the yaml > resource. However, when we want to parse a List from a string, we use > semicolon to split the value. > > The following is a unit test to reproduce this problem. 
> {code:java}
> public void testWriteConfigurationAndReload() throws IOException {
> 	final File flinkConfDir = temporaryFolder.newFolder().getAbsoluteFile();
> 	final Configuration flinkConfig = new Configuration();
> 	final ConfigOption<List<String>> listConfigOption = ConfigOptions
> 		.key("test-list-string-key")
> 		.stringType()
> 		.asList()
> 		.noDefaultValue();
> 	final List<String> values = Arrays.asList("value1", "value2", "value3");
> 	flinkConfig.set(listConfigOption, values);
> 	assertThat(values, Matchers.containsInAnyOrder(flinkConfig.get(listConfigOption).toArray()));
> 	BootstrapTools.writeConfiguration(flinkConfig, new File(flinkConfDir, "flink-conf.yaml"));
> 	final Configuration loadedFlinkConfig = GlobalConfiguration.loadConfiguration(flinkConfDir.getAbsolutePath());
> 	assertThat(values, Matchers.containsInAnyOrder(loadedFlinkConfig.get(listConfigOption).toArray()));
> }
> {code}
[GitHub] [flink] flinkbot edited a comment on issue #7702: [FLINK-11088][Security][YARN] Allow YARN to discover pre-installed keytab files
flinkbot edited a comment on issue #7702: [FLINK-11088][Security][YARN] Allow YARN to discover pre-installed keytab files URL: https://github.com/apache/flink/pull/7702#issuecomment-572195960 ## CI report: * 72dd07f5f10a56adf6025e82083af21ada47c711 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/143614040) Azure: [SUCCESS](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=4198) * f2387288cb33f288164ed9d102b47868a93dc898 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148710964) Azure: [CANCELED](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5118) Bot commands The @flinkbot bot supports the following commands: - `@flinkbot run travis` re-run the last Travis build - `@flinkbot run azure` re-run the last Azure build This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [flink] HuangXingBo commented on a change in pull request #11044: [FLINK-15913][python] Add Python TableFunction Runner and Operator in Blink planner
HuangXingBo commented on a change in pull request #11044: [FLINK-15913][python] Add Python TableFunction Runner and Operator in Blink planner URL: https://github.com/apache/flink/pull/11044#discussion_r378658794 ## File path: flink-python/src/main/java/org/apache/flink/table/runtime/operators/python/BaseRowPythonTableFunctionOperator.java ## @@ -0,0 +1,148 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ + +package org.apache.flink.table.runtime.operators.python; + +import org.apache.flink.annotation.Internal; +import org.apache.flink.api.common.typeutils.TypeSerializer; +import org.apache.flink.configuration.Configuration; +import org.apache.flink.python.PythonFunctionRunner; +import org.apache.flink.python.env.PythonEnvironmentManager; +import org.apache.flink.table.api.TableConfig; +import org.apache.flink.table.dataformat.BaseRow; +import org.apache.flink.table.dataformat.BinaryRow; +import org.apache.flink.table.dataformat.JoinedRow; +import org.apache.flink.table.functions.TableFunction; +import org.apache.flink.table.functions.python.PythonFunctionInfo; +import org.apache.flink.table.planner.codegen.CodeGeneratorContext; +import org.apache.flink.table.planner.codegen.ProjectionCodeGenerator; +import org.apache.flink.table.runtime.generated.GeneratedProjection; +import org.apache.flink.table.runtime.generated.Projection; +import org.apache.flink.table.runtime.runners.python.BaseRowPythonTableFunctionRunner; +import org.apache.flink.table.runtime.typeutils.PythonTypeUtils; +import org.apache.flink.table.types.logical.RowType; + +import org.apache.beam.sdk.fn.data.FnDataReceiver; + +import java.io.IOException; + +/** + * The Python {@link TableFunction} operator for the blink planner. + */ +@Internal +public class BaseRowPythonTableFunctionOperator + extends AbstractPythonTableFunctionOperator { + + + private static final long serialVersionUID = 1L; + + /** +* The collector used to collect records. +*/ + private transient StreamRecordBaseRowWrappingCollector baseRowWrapper; + + /** +* The JoinedRow reused holding the execution result. +*/ + private transient JoinedRow reuseJoinedRow; + + /** +* The Projection which projects the udtf input fields from the input row. +*/ + private transient Projection udtfInputProjection; + + /** +* The TypeSerializer for udtf execution results. 
+*/ + private transient TypeSerializer udtfOutputTypeSerializer; + + public BaseRowPythonTableFunctionOperator( + Configuration config, + PythonFunctionInfo tableFunction, + RowType inputType, + RowType outputType, + int[] udtfInputOffsets) { + super(config, tableFunction, inputType, outputType, udtfInputOffsets); + } + + @Override + @SuppressWarnings("unchecked") + public void open() throws Exception { + super.open(); + baseRowWrapper = new StreamRecordBaseRowWrappingCollector(output); + reuseJoinedRow = new JoinedRow(); + + udtfInputProjection = createUdtfInputProjection(); + udtfOutputTypeSerializer = PythonTypeUtils.toBlinkTypeSerializer(userDefinedFunctionOutputType); + } + + @Override + public void bufferInput(BaseRow input) { + forwardedInputQueue.add(input); + } + + @Override + public BaseRow getUdfInput(BaseRow element) { + return udtfInputProjection.apply(element); + } + + @Override + public PythonFunctionRunner createPythonFunctionRunner( + FnDataReceiver resultReceiver, + PythonEnvironmentManager pythonEnvironmentManager) { + return new BaseRowPythonTableFunctionRunner( + getRuntimeContext().getTaskName(), + resultReceiver, + tableFunction, + pythonEnvironmentManager, + userDefinedFunctionInputType, + userDefinedFunctionOutputType); + } + + private Projection
[GitHub] [flink] flinkbot commented on issue #11078: [FLINK-16026][python] Limits the avro-python3 version in Flink
flinkbot commented on issue #11078: [FLINK-16026][python] Limits the avro-python3 version in Flink URL: https://github.com/apache/flink/pull/11078#issuecomment-585556257 ## CI report: * 389c6de14afd66e75ceff564cdeb2a5d70838c6d UNKNOWN
[GitHub] [flink] flinkbot edited a comment on issue #11076: [FLINK-15908][docs] Add description of support 'pip install' to 1.9.x documents.
flinkbot edited a comment on issue #11076: [FLINK-15908][docs] Add description of support 'pip install' to 1.9.x documents. URL: https://github.com/apache/flink/pull/11076#issuecomment-585541734 ## CI report: * e2fc04377969fe59ba6de22e47d93050ead27d6a Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148717893)
[GitHub] [flink] flinkbot edited a comment on issue #11070: [FLINK-16013][core] Write and parse list config option correctly
flinkbot edited a comment on issue #11070: [FLINK-16013][core] Write and parse list config option correctly URL: https://github.com/apache/flink/pull/11070#issuecomment-585205728 ## CI report: * 8bdc077ea0813363d120eed404a2f4cd93ef0f9a Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148576486) Azure: [SUCCESS](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5107) * 991a08ce8916c79874682ae2d088d3e7ab3606a3 Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/148719940) * 99dad8370f26efe83241865b2f3ec4771fc1b436 UNKNOWN
[GitHub] [flink] flinkbot edited a comment on issue #11077: [FLINK-15909][build] Add PyPI release process into the subsequent release of 1.9.x.
flinkbot edited a comment on issue #11077: [FLINK-15909][build] Add PyPI release process into the subsequent release of 1.9.x. URL: https://github.com/apache/flink/pull/11077#issuecomment-585548400 ## CI report: * 4ac010892eab3b0a44397b03aff0562c1fe407df Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/148719956)
[GitHub] [flink] dianfu commented on issue #11078: [FLINK-16026][python] Limits the avro-python3 version in Flink
dianfu commented on issue #11078: [FLINK-16026][python] Limits the avro-python3 version in Flink URL: https://github.com/apache/flink/pull/11078#issuecomment-585552968 LGTM. Will merge the PR once Travis turns green.
[GitHub] [flink] dianfu commented on issue #11078: [FLINK-16026][python] Limits the avro-python3 version in Flink
dianfu commented on issue #11078: [FLINK-16026][python] Limits the avro-python3 version in Flink URL: https://github.com/apache/flink/pull/11078#issuecomment-585552797 @flinkbot approve all
[jira] [Commented] (FLINK-16026) Travis failed due to python setup
[ https://issues.apache.org/jira/browse/FLINK-16026?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17035921#comment-17035921 ] Dian Fu commented on FLINK-16026: - [~lzljs3620320] Thanks for reporting this issue. [~hxbks2ks] Thanks for the analysis and fix. +1 to limit the version of avro-python3 in PyFlink. > Travis failed due to python setup > - > > Key: FLINK-16026 > URL: https://issues.apache.org/jira/browse/FLINK-16026 > Project: Flink > Issue Type: Bug > Components: API / Python >Reporter: Jingsong Lee >Assignee: Huang Xingbo >Priority: Critical > Labels: pull-request-available > Fix For: 1.10.1, 1.11.0 > > Time Spent: 10m > Remaining Estimate: 0h > > [https://api.travis-ci.com/v3/job/286671652/log.txt] > [https://api.travis-ci.org/v3/job/649754603/log.txt] > [https://api.travis-ci.com/v3/job/286409130/log.txt] > Collecting avro-python3<2.0.0,>=1.8.1; python_version >= "3.0" (from > apache-beam==2.19.0->apache-flink==1.11.dev0) Using cached > https://files.pythonhosted.org/packages/31/21/d98e2515e5ca0337d7e747e8065227ee77faf5c817bbb74391899613178a/avro-python3-1.9.2.tar.gz > Complete output from command python setup.py egg_info: Traceback (most > recent call last): File "", line 1, in File > "/tmp/pip-install-d6uvsl_b/avro-python3/setup.py", line 41, in > import pycodestyle ModuleNotFoundError: No module named 'pycodestyle' > Command "python setup.py egg_info" > failed with error code 1 in /tmp/pip-install-d6uvsl_b/avro-python3/ You are > using pip version 10.0.1, however version 20.0.2 is available. You should > consider upgrading via the 'pip install --upgrade pip' command. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Commented] (FLINK-16023) jdbc connector's 'connector.table' property should be optional rather than required
[ https://issues.apache.org/jira/browse/FLINK-16023?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17035919#comment-17035919 ] Bowen Li commented on FLINK-16023: -- [~jark] The fundamental reason is that such a design didn't consider integration with catalogs. Once introduced catalog, the flink table's name should be just the 'table''s name in corresponding external system. this is not specific to jdbc, but general to kafka and hbase sql connector as well. I just created FLINK-16027 and FLINK-16028 to track them > jdbc connector's 'connector.table' property should be optional rather than > required > --- > > Key: FLINK-16023 > URL: https://issues.apache.org/jira/browse/FLINK-16023 > Project: Flink > Issue Type: Improvement > Components: Connectors / JDBC >Reporter: Bowen Li >Assignee: Jingsong Lee >Priority: Major > Fix For: 1.11.0 > > > jdbc connector's 'connector.table' property should be optional rather than > required. > connector should assume the table name in dbms is the same as that in Flink > when this property is not present > The fundamental reason is that such a design didn't consider integration with > catalogs. Once introduced catalog, the flink table's name should be just the > 'table''s name in corresponding external system. > cc [~ykt836] -- This message was sent by Atlassian Jira (v8.3.4#803005)
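The defaulting behavior proposed in the issue (use the Flink table's own name when 'connector.table' is absent) can be sketched as follows. The class and method names here are hypothetical illustrations, not the actual connector code:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

public class TableNameResolver {

    // Hypothetical helper: resolve the name of the table in the external
    // system (JDBC/Kafka/HBase), defaulting to the Flink table's own name
    // when the 'connector.table' property is not present.
    static String resolveExternalTableName(Map<String, String> properties, String flinkTableName) {
        return Optional.ofNullable(properties.get("connector.table")).orElse(flinkTableName);
    }

    public static void main(String[] args) {
        Map<String, String> props = new HashMap<>();
        // Property absent: fall back to the Flink table name.
        System.out.println(resolveExternalTableName(props, "orders"));
        // Property present: the explicit value wins.
        props.put("connector.table", "orders_v2");
        System.out.println(resolveExternalTableName(props, "orders"));
    }
}
```

This makes the property an override rather than a requirement, which matches the catalog-integration argument: once a catalog maps Flink table names onto the external system's names, the common case needs no explicit property at all.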
[jira] [Created] (FLINK-16028) hbase connector's 'connector.table-name' property should be optional rather than required
Bowen Li created FLINK-16028: Summary: hbase connector's 'connector.table-name' property should be optional rather than required Key: FLINK-16028 URL: https://issues.apache.org/jira/browse/FLINK-16028 Project: Flink Issue Type: Improvement Reporter: Bowen Li cc [~lzljs3620320] [~jark] -- This message was sent by Atlassian Jira (v8.3.4#803005)
[GitHub] [flink] flinkbot commented on issue #11078: [FLINK-16026][python] Limits the avro-python3 version in Flink
flinkbot commented on issue #11078: [FLINK-16026][python] Limits the avro-python3 version in Flink URL: https://github.com/apache/flink/pull/11078#issuecomment-585551836 Thanks a lot for your contribution to the Apache Flink project. I'm the @flinkbot. I help the community to review your pull request. We will use this comment to track the progress of the review. ## Automated Checks Last check on commit 389c6de14afd66e75ceff564cdeb2a5d70838c6d (Thu Feb 13 05:02:44 UTC 2020) **Warnings:** * No documentation files were touched! Remember to keep the Flink docs up to date! Mention the bot in a comment to re-run the automated checks. ## Review Progress * ❓ 1. The [description] looks good. * ❓ 2. There is [consensus] that the contribution should go into to Flink. * ❓ 3. Needs [attention] from. * ❓ 4. The change fits into the overall [architecture]. * ❓ 5. Overall code [quality] is good. Please see the [Pull Request Review Guide](https://flink.apache.org/contributing/reviewing-prs.html) for a full explanation of the review process. The Bot is tracking the review progress through labels. Labels are applied according to the order of the review items. For consensus, approval by a Flink committer of PMC member is required Bot commands The @flinkbot bot supports the following commands: - `@flinkbot approve description` to approve one or more aspects (aspects: `description`, `consensus`, `architecture` and `quality`) - `@flinkbot approve all` to approve all aspects - `@flinkbot approve-until architecture` to approve everything until `architecture` - `@flinkbot attention @username1 [@username2 ..]` to require somebody's attention - `@flinkbot disapprove architecture` to remove an approval you gave earlier This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. 
[jira] [Updated] (FLINK-16026) Travis failed due to python setup
[ https://issues.apache.org/jira/browse/FLINK-16026?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] ASF GitHub Bot updated FLINK-16026: --- Labels: pull-request-available (was: ) > Travis failed due to python setup > - > > Key: FLINK-16026 > URL: https://issues.apache.org/jira/browse/FLINK-16026 > Project: Flink > Issue Type: Bug > Components: API / Python >Reporter: Jingsong Lee >Assignee: Huang Xingbo >Priority: Critical > Labels: pull-request-available > Fix For: 1.11.0 > > > [https://api.travis-ci.com/v3/job/286671652/log.txt] > [https://api.travis-ci.org/v3/job/649754603/log.txt] > [https://api.travis-ci.com/v3/job/286409130/log.txt] > Collecting avro-python3<2.0.0,>=1.8.1; python_version >= "3.0" (from > apache-beam==2.19.0->apache-flink==1.11.dev0) Using cached > https://files.pythonhosted.org/packages/31/21/d98e2515e5ca0337d7e747e8065227ee77faf5c817bbb74391899613178a/avro-python3-1.9.2.tar.gz > Complete output from command python setup.py egg_info: Traceback (most > recent call last): File "", line 1, in File > "/tmp/pip-install-d6uvsl_b/avro-python3/setup.py", line 41, in > import pycodestyle ModuleNotFoundError: No module named 'pycodestyle' > Command "python setup.py egg_info" > failed with error code 1 in /tmp/pip-install-d6uvsl_b/avro-python3/ You are > using pip version 10.0.1, however version 20.0.2 is available. You should > consider upgrading via the 'pip install --upgrade pip' command. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (FLINK-16026) Travis failed due to python setup
[ https://issues.apache.org/jira/browse/FLINK-16026?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dian Fu updated FLINK-16026: Fix Version/s: 1.10.1 > Travis failed due to python setup > - > > Key: FLINK-16026 > URL: https://issues.apache.org/jira/browse/FLINK-16026 > Project: Flink > Issue Type: Bug > Components: API / Python >Reporter: Jingsong Lee >Assignee: Huang Xingbo >Priority: Critical > Labels: pull-request-available > Fix For: 1.10.1, 1.11.0 > > Time Spent: 10m > Remaining Estimate: 0h > > [https://api.travis-ci.com/v3/job/286671652/log.txt] > [https://api.travis-ci.org/v3/job/649754603/log.txt] > [https://api.travis-ci.com/v3/job/286409130/log.txt] > Collecting avro-python3<2.0.0,>=1.8.1; python_version >= "3.0" (from > apache-beam==2.19.0->apache-flink==1.11.dev0) Using cached > https://files.pythonhosted.org/packages/31/21/d98e2515e5ca0337d7e747e8065227ee77faf5c817bbb74391899613178a/avro-python3-1.9.2.tar.gz > Complete output from command python setup.py egg_info: Traceback (most > recent call last): File "", line 1, in File > "/tmp/pip-install-d6uvsl_b/avro-python3/setup.py", line 41, in > import pycodestyle ModuleNotFoundError: No module named 'pycodestyle' > Command "python setup.py egg_info" > failed with error code 1 in /tmp/pip-install-d6uvsl_b/avro-python3/ You are > using pip version 10.0.1, however version 20.0.2 is available. You should > consider upgrading via the 'pip install --upgrade pip' command. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Created] (FLINK-16027) kafka connector's 'connector.topic' property should be optional rather than required
Bowen Li created FLINK-16027: Summary: kafka connector's 'connector.topic' property should be optional rather than required Key: FLINK-16027 URL: https://issues.apache.org/jira/browse/FLINK-16027 Project: Flink Issue Type: Improvement Components: Connectors / Kafka Reporter: Bowen Li Assignee: Jingsong Lee Fix For: 1.11.0 -- This message was sent by Atlassian Jira (v8.3.4#803005)
[GitHub] [flink] HuangXingBo opened a new pull request #11078: [FLINK-16026][python] Limits the avro-python3 version in Flink
HuangXingBo opened a new pull request #11078: [FLINK-16026][python] Limits the avro-python3 version in Flink URL: https://github.com/apache/flink/pull/11078 ## What is the purpose of the change *This pull request will limit the avro-python3 version in Flink* ## Brief change log - *limit the avro-python3 version* ## Verifying this change This change added tests and can be verified as follows: *tox test* ## Does this pull request potentially affect one of the following parts: - Dependencies (does it add or upgrade a dependency): (no) - The public API, i.e., is any changed class annotated with `@Public(Evolving)`: (no) - The serializers: (no) - The runtime per-record code paths (performance sensitive): (no) - Anything that affects deployment or recovery: JobManager (and its components), Checkpointing, Yarn/Mesos, ZooKeeper: (no) - The S3 file system connector: (no) ## Documentation - Does this pull request introduce a new feature? (no) - If yes, how is the feature documented? (not applicable)
[jira] [Assigned] (FLINK-16026) Travis failed due to python setup
[ https://issues.apache.org/jira/browse/FLINK-16026?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dian Fu reassigned FLINK-16026: --- Assignee: Dian Fu > Travis failed due to python setup > - > > Key: FLINK-16026 > URL: https://issues.apache.org/jira/browse/FLINK-16026 > Project: Flink > Issue Type: Bug > Components: API / Python >Reporter: Jingsong Lee >Assignee: Dian Fu >Priority: Critical > Fix For: 1.11.0 > > > [https://api.travis-ci.com/v3/job/286671652/log.txt] > [https://api.travis-ci.org/v3/job/649754603/log.txt] > [https://api.travis-ci.com/v3/job/286409130/log.txt] > Collecting avro-python3<2.0.0,>=1.8.1; python_version >= "3.0" (from > apache-beam==2.19.0->apache-flink==1.11.dev0) Using cached > https://files.pythonhosted.org/packages/31/21/d98e2515e5ca0337d7e747e8065227ee77faf5c817bbb74391899613178a/avro-python3-1.9.2.tar.gz > Complete output from command python setup.py egg_info: Traceback (most > recent call last): File "", line 1, in File > "/tmp/pip-install-d6uvsl_b/avro-python3/setup.py", line 41, in > import pycodestyle ModuleNotFoundError: No module named 'pycodestyle' > Command "python setup.py egg_info" > failed with error code 1 in /tmp/pip-install-d6uvsl_b/avro-python3/ You are > using pip version 10.0.1, however version 20.0.2 is available. You should > consider upgrading via the 'pip install --upgrade pip' command. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Assigned] (FLINK-16026) Travis failed due to python setup
[ https://issues.apache.org/jira/browse/FLINK-16026?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dian Fu reassigned FLINK-16026: --- Assignee: Huang Xingbo (was: Dian Fu) > Travis failed due to python setup > - > > Key: FLINK-16026 > URL: https://issues.apache.org/jira/browse/FLINK-16026 > Project: Flink > Issue Type: Bug > Components: API / Python >Reporter: Jingsong Lee >Assignee: Huang Xingbo >Priority: Critical > Fix For: 1.11.0 > > > [https://api.travis-ci.com/v3/job/286671652/log.txt] > [https://api.travis-ci.org/v3/job/649754603/log.txt] > [https://api.travis-ci.com/v3/job/286409130/log.txt] > Collecting avro-python3<2.0.0,>=1.8.1; python_version >= "3.0" (from > apache-beam==2.19.0->apache-flink==1.11.dev0) Using cached > https://files.pythonhosted.org/packages/31/21/d98e2515e5ca0337d7e747e8065227ee77faf5c817bbb74391899613178a/avro-python3-1.9.2.tar.gz > Complete output from command python setup.py egg_info: Traceback (most > recent call last): File "", line 1, in File > "/tmp/pip-install-d6uvsl_b/avro-python3/setup.py", line 41, in > import pycodestyle ModuleNotFoundError: No module named 'pycodestyle' > Command "python setup.py egg_info" > failed with error code 1 in /tmp/pip-install-d6uvsl_b/avro-python3/ You are > using pip version 10.0.1, however version 20.0.2 is available. You should > consider upgrading via the 'pip install --upgrade pip' command. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (FLINK-16023) jdbc connector's 'connector.table' property should be optional rather than required
[ https://issues.apache.org/jira/browse/FLINK-16023?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Bowen Li updated FLINK-16023: - Description: jdbc connector's 'connector.table' property should be optional rather than required. connector should assume the table name in dbms is the same as that in Flink when this property is not present The fundamental reason is that such a design didn't consider integration with catalogs. Once introduced catalog, the flink table's name should be just the 'table''s name in corresponding external system. cc [~ykt836] was: jdbc connector's 'connector.table' property should be optional rather than required. connector should assume the table name in dbms is the same as that in Flink when this property is not present cc [~ykt836] > jdbc connector's 'connector.table' property should be optional rather than > required > --- > > Key: FLINK-16023 > URL: https://issues.apache.org/jira/browse/FLINK-16023 > Project: Flink > Issue Type: Improvement > Components: Connectors / JDBC >Reporter: Bowen Li >Assignee: Jingsong Lee >Priority: Major > Fix For: 1.11.0 > > > jdbc connector's 'connector.table' property should be optional rather than > required. > connector should assume the table name in dbms is the same as that in Flink > when this property is not present > The fundamental reason is that such a design didn't consider integration with > catalogs. Once introduced catalog, the flink table's name should be just the > 'table''s name in corresponding external system. > cc [~ykt836] -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Commented] (FLINK-16026) Travis failed due to python setup
[ https://issues.apache.org/jira/browse/FLINK-16026?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17035916#comment-17035916 ] Huang Xingbo commented on FLINK-16026: -- This issue is caused by the latest released avro package. This package is broken and it has already been discussed in avro community to release a patched version(AVRO-2737). However, I'm not sure when the new package will be available and I suggest to limit the avro-python3 version in Flink to work around this issue. I will submit a PR ASAP. > Travis failed due to python setup > - > > Key: FLINK-16026 > URL: https://issues.apache.org/jira/browse/FLINK-16026 > Project: Flink > Issue Type: Bug > Components: API / Python >Reporter: Jingsong Lee >Priority: Critical > Fix For: 1.11.0 > > > [https://api.travis-ci.com/v3/job/286671652/log.txt] > [https://api.travis-ci.org/v3/job/649754603/log.txt] > [https://api.travis-ci.com/v3/job/286409130/log.txt] > Collecting avro-python3<2.0.0,>=1.8.1; python_version >= "3.0" (from > apache-beam==2.19.0->apache-flink==1.11.dev0) Using cached > https://files.pythonhosted.org/packages/31/21/d98e2515e5ca0337d7e747e8065227ee77faf5c817bbb74391899613178a/avro-python3-1.9.2.tar.gz > Complete output from command python setup.py egg_info: Traceback (most > recent call last): File "", line 1, in File > "/tmp/pip-install-d6uvsl_b/avro-python3/setup.py", line 41, in > import pycodestyle ModuleNotFoundError: No module named 'pycodestyle' > Command "python setup.py egg_info" > failed with error code 1 in /tmp/pip-install-d6uvsl_b/avro-python3/ You are > using pip version 10.0.1, however version 20.0.2 is available. You should > consider upgrading via the 'pip install --upgrade pip' command. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[GitHub] [flink] flinkbot commented on issue #11077: [FLINK-15909][build] Add PyPI release process into the subsequent release of 1.9.x.
flinkbot commented on issue #11077: [FLINK-15909][build] Add PyPI release process into the subsequent release of 1.9.x. URL: https://github.com/apache/flink/pull/11077#issuecomment-585548400 ## CI report: * 4ac010892eab3b0a44397b03aff0562c1fe407df UNKNOWN
[GitHub] [flink] flinkbot edited a comment on issue #11076: [FLINK-15908][docs] Add description of support 'pip install' to 1.9.x documents.
flinkbot edited a comment on issue #11076: [FLINK-15908][docs] Add description of support 'pip install' to 1.9.x documents. URL: https://github.com/apache/flink/pull/11076#issuecomment-585541734 ## CI report: * e2fc04377969fe59ba6de22e47d93050ead27d6a Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/148717893)
[GitHub] [flink] flinkbot edited a comment on issue #11069: [hotfix] Fix some typos in flink-examples-streaming
flinkbot edited a comment on issue #11069: [hotfix] Fix some typos in flink-examples-streaming URL: https://github.com/apache/flink/pull/11069#issuecomment-585181372 ## CI report: * 059e816c9f5973563a4f03e4c6af2e4af4729b66 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148567365) Azure: [SUCCESS](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5105) * 3cf0448fa909468f6c2d5508fe527be436ef475f Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148715451) * 578e1424988f95f022c3fbd558b54d5fe8ffe84c Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/148717843)
[GitHub] [flink] flinkbot edited a comment on issue #11070: [FLINK-16013][core] Write and parse list config option correctly
flinkbot edited a comment on issue #11070: [FLINK-16013][core] Write and parse list config option correctly URL: https://github.com/apache/flink/pull/11070#issuecomment-585205728 ## CI report: * 8bdc077ea0813363d120eed404a2f4cd93ef0f9a Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148576486) Azure: [SUCCESS](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5107) * 991a08ce8916c79874682ae2d088d3e7ab3606a3 UNKNOWN
[GitHub] [flink] JingsongLi commented on a change in pull request #11034: [FLINK-15802][table] Support new type inference for table functions
JingsongLi commented on a change in pull request #11034: [FLINK-15802][table] Support new type inference for table functions URL: https://github.com/apache/flink/pull/11034#discussion_r378646800 ## File path: flink-table/flink-table-planner-blink/src/main/java/org/apache/calcite/sql/validate/ProcedureNamespace.java ## @@ -0,0 +1,80 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one or more + * contributor license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright ownership. + * The ASF licenses this file to you under the Apache License, Version 2.0 + * (the "License"); you may not use this file except in compliance with + * the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ +package org.apache.calcite.sql.validate; + +import org.apache.flink.annotation.Internal; +import org.apache.flink.table.planner.functions.utils.TableSqlFunction; + +import org.apache.calcite.rel.type.RelDataType; +import org.apache.calcite.sql.SqlCall; +import org.apache.calcite.sql.SqlCallBinding; +import org.apache.calcite.sql.SqlKind; +import org.apache.calcite.sql.SqlNode; +import org.apache.calcite.sql.SqlOperator; +import org.apache.calcite.sql.SqlUtil; +import org.apache.calcite.sql.type.SqlTypeName; + +/** + * Namespace whose contents are defined by the result of a call to a user-defined procedure. + * + * Note: Compared to Calcite, this class implements custom logic for dealing with collection tables + * like {@code TABLE(function(...))} procedures. 
Compared to the SQL standard, Flink's table functions + * can return arbitrary types that are wrapped into a ROW type if necessary. We don't interpret ARRAY + * or MULTISET types as it would be standard. + */ +@Internal +public final class ProcedureNamespace extends AbstractNamespace { + + private final SqlValidatorScope scope; + + private final SqlCall call; + + ProcedureNamespace( + SqlValidatorImpl validator, + SqlValidatorScope scope, + SqlCall call, + SqlNode enclosingNode) { + super(validator, enclosingNode); + this.scope = scope; + this.call = call; + } + + public RelDataType validateImpl(RelDataType targetRowType) { + validator.inferUnknownTypes(validator.unknownType, scope, call); + final RelDataType type = validator.deriveTypeImpl(scope, call); + final SqlOperator operator = call.getOperator(); + final SqlCallBinding callBinding = + new SqlCallBinding(validator, scope, call); + // legacy table functions + if (operator instanceof TableSqlFunction) { Review comment: It seems that we cannot use `TableSqlFunction` here, because the `TableSqlFunction` classes differ between the legacy planner and the blink planner. That would force us to maintain different `ProcedureNamespace` code for the two planners, which will lead to bugs. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[jira] [Created] (FLINK-16026) Travis failed due to python setup
Jingsong Lee created FLINK-16026: Summary: Travis failed due to python setup Key: FLINK-16026 URL: https://issues.apache.org/jira/browse/FLINK-16026 Project: Flink Issue Type: Bug Components: API / Python Reporter: Jingsong Lee Fix For: 1.11.0 [https://api.travis-ci.com/v3/job/286671652/log.txt] [https://api.travis-ci.org/v3/job/649754603/log.txt] [https://api.travis-ci.com/v3/job/286409130/log.txt]
Collecting avro-python3<2.0.0,>=1.8.1; python_version >= "3.0" (from apache-beam==2.19.0->apache-flink==1.11.dev0)
  Using cached https://files.pythonhosted.org/packages/31/21/d98e2515e5ca0337d7e747e8065227ee77faf5c817bbb74391899613178a/avro-python3-1.9.2.tar.gz
  Complete output from command python setup.py egg_info:
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "/tmp/pip-install-d6uvsl_b/avro-python3/setup.py", line 41, in <module>
        import pycodestyle
    ModuleNotFoundError: No module named 'pycodestyle'
Command "python setup.py egg_info" failed with error code 1 in /tmp/pip-install-d6uvsl_b/avro-python3/
You are using pip version 10.0.1, however version 20.0.2 is available. You should consider upgrading via the 'pip install --upgrade pip' command.
-- This message was sent by Atlassian Jira (v8.3.4#803005)
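The traceback above is a packaging quirk rather than a Flink bug: the avro-python3 1.9.2 sdist imports pycodestyle at module level in its setup.py, so the import must succeed before pip can even compute egg_info. A workaround sketch (one possible mitigation in a CI environment; the Flink build may have adopted a different fix, such as pinning the dependency elsewhere):

```shell
# Assumption: this mirrors the failing Travis build environment.

# Option 1: make pycodestyle importable before avro-python3's setup.py runs.
python -m pip install pycodestyle
python -m pip install 'avro-python3<2.0.0,>=1.8.1'

# Option 2: avoid the broken 1.9.2 sdist entirely by pinning below it.
python -m pip install 'avro-python3<1.9.2,>=1.8.1'
```

Either option unblocks the `apache-beam` dependency chain shown in the log; option 2 is the more reproducible choice for CI since it does not depend on install ordering.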
[jira] [Updated] (FLINK-16013) List and map config options could not be parsed correctly
[ https://issues.apache.org/jira/browse/FLINK-16013?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Wang updated FLINK-16013: -- Summary: List and map config options could not be parsed correctly (was: List and map config option could not be parsed correctly) > List and map config options could not be parsed correctly > - > > Key: FLINK-16013 > URL: https://issues.apache.org/jira/browse/FLINK-16013 > Project: Flink > Issue Type: Bug > Components: Runtime / Configuration >Affects Versions: 1.10.0 >Reporter: Yang Wang >Assignee: Yang Wang >Priority: Major > Labels: pull-request-available > Fix For: 1.10.1, 1.11.0 > > Time Spent: 10m > Remaining Estimate: 0h > > Currently, if a config option is {{List}} type and written to a > flink-conf.yaml, it could not be parsed correctly when reloaded from yaml > resource. The root cause is we use {{List#toString}} to save into the yaml > resource. However, when we want to parse a List from a string, we use > semicolon to split the value. > > The following is a unit test to reproduce this problem. 
> {code:java}
> public void testWriteConfigurationAndReload() throws IOException {
>     final File flinkConfDir = temporaryFolder.newFolder().getAbsoluteFile();
>     final Configuration flinkConfig = new Configuration();
>     final ConfigOption<List<String>> listConfigOption = ConfigOptions
>         .key("test-list-string-key")
>         .stringType()
>         .asList()
>         .noDefaultValue();
>     final List<String> values = Arrays.asList("value1", "value2", "value3");
>     flinkConfig.set(listConfigOption, values);
>     assertThat(values, Matchers.containsInAnyOrder(flinkConfig.get(listConfigOption).toArray()));
>     BootstrapTools.writeConfiguration(flinkConfig, new File(flinkConfDir, "flink-conf.yaml"));
>     final Configuration loadedFlinkConfig = GlobalConfiguration.loadConfiguration(flinkConfDir.getAbsolutePath());
>     assertThat(values, Matchers.containsInAnyOrder(loadedFlinkConfig.get(listConfigOption).toArray()));
> }
> {code}
-- This message was sent by Atlassian Jira (v8.3.4#803005)
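The round-trip mismatch described in the issue can be sketched standalone. This is plain Java only; the class name is hypothetical and the semicolon delimiter is taken from the issue description, not from Flink's actual Configuration code:

```java
import java.util.Arrays;
import java.util.List;

// Minimal sketch of the FLINK-16013 root cause: the value is written with
// List#toString but read back by splitting the string on semicolons.
public class ListOptionRoundTrip {
    public static void main(String[] args) {
        List<String> values = Arrays.asList("value1", "value2", "value3");

        // Write side: List#toString yields "[value1, value2, value3]".
        String written = values.toString();

        // Read side: the parser splits on ';', so the whole bracketed,
        // comma-separated string survives as one bogus element.
        List<String> parsed = Arrays.asList(written.split(";"));

        System.out.println(written);       // [value1, value2, value3]
        System.out.println(parsed.size()); // 1 -- the three values were not recovered
    }
}
```

This is why the suggested fix moves list serialization into a dedicated `convertToString` path that uses the same delimiter as the parser.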
[jira] [Updated] (FLINK-16013) List and map config option could not be parsed correctly
[ https://issues.apache.org/jira/browse/FLINK-16013?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Wang updated FLINK-16013: -- Summary: List and map config option could not be parsed correctly (was: List config option could not be parsed correctly) > List and map config option could not be parsed correctly > > > Key: FLINK-16013 > URL: https://issues.apache.org/jira/browse/FLINK-16013 > Project: Flink > Issue Type: Bug > Components: Runtime / Configuration >Affects Versions: 1.10.0 >Reporter: Yang Wang >Assignee: Yang Wang >Priority: Major > Labels: pull-request-available > Fix For: 1.10.1, 1.11.0 > > Time Spent: 10m > Remaining Estimate: 0h > > Currently, if a config option is {{List}} type and written to a > flink-conf.yaml, it could not be parsed correctly when reloaded from yaml > resource. The root cause is we use {{List#toString}} to save into the yaml > resource. However, when we want to parse a List from a string, we use > semicolon to split the value. > > The following is a unit test to reproduce this problem. 
> {code:java}
> public void testWriteConfigurationAndReload() throws IOException {
>     final File flinkConfDir = temporaryFolder.newFolder().getAbsoluteFile();
>     final Configuration flinkConfig = new Configuration();
>     final ConfigOption<List<String>> listConfigOption = ConfigOptions
>         .key("test-list-string-key")
>         .stringType()
>         .asList()
>         .noDefaultValue();
>     final List<String> values = Arrays.asList("value1", "value2", "value3");
>     flinkConfig.set(listConfigOption, values);
>     assertThat(values, Matchers.containsInAnyOrder(flinkConfig.get(listConfigOption).toArray()));
>     BootstrapTools.writeConfiguration(flinkConfig, new File(flinkConfDir, "flink-conf.yaml"));
>     final Configuration loadedFlinkConfig = GlobalConfiguration.loadConfiguration(flinkConfDir.getAbsolutePath());
>     assertThat(values, Matchers.containsInAnyOrder(loadedFlinkConfig.get(listConfigOption).toArray()));
> }
> {code}
-- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Commented] (FLINK-16013) List config option could not be parsed correctly
[ https://issues.apache.org/jira/browse/FLINK-16013?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17035904#comment-17035904 ] Yang Wang commented on FLINK-16013: --- [~dwysakowicz] Thanks for your suggestion. It really makes sense to put the conversion of {{List}} to string into {{convertToString}}. I also find that not only the {{List}} but also the {{Map}} could not be parsed correctly. Other complex types, {{Enum}}, {{Duration}} and {{MemorySize}}, work well. > List config option could not be parsed correctly > > > Key: FLINK-16013 > URL: https://issues.apache.org/jira/browse/FLINK-16013 > Project: Flink > Issue Type: Bug > Components: Runtime / Configuration >Affects Versions: 1.10.0 >Reporter: Yang Wang >Assignee: Yang Wang >Priority: Major > Labels: pull-request-available > Fix For: 1.10.1, 1.11.0 > > Time Spent: 10m > Remaining Estimate: 0h > > Currently, if a config option is {{List}} type and written to a > flink-conf.yaml, it could not be parsed correctly when reloaded from yaml > resource. The root cause is we use {{List#toString}} to save into the yaml > resource. However, when we want to parse a List from a string, we use > semicolon to split the value. > > The following is a unit test to reproduce this problem. 
> {code:java}
> public void testWriteConfigurationAndReload() throws IOException {
>     final File flinkConfDir = temporaryFolder.newFolder().getAbsoluteFile();
>     final Configuration flinkConfig = new Configuration();
>     final ConfigOption<List<String>> listConfigOption = ConfigOptions
>         .key("test-list-string-key")
>         .stringType()
>         .asList()
>         .noDefaultValue();
>     final List<String> values = Arrays.asList("value1", "value2", "value3");
>     flinkConfig.set(listConfigOption, values);
>     assertThat(values, Matchers.containsInAnyOrder(flinkConfig.get(listConfigOption).toArray()));
>     BootstrapTools.writeConfiguration(flinkConfig, new File(flinkConfDir, "flink-conf.yaml"));
>     final Configuration loadedFlinkConfig = GlobalConfiguration.loadConfiguration(flinkConfDir.getAbsolutePath());
>     assertThat(values, Matchers.containsInAnyOrder(loadedFlinkConfig.get(listConfigOption).toArray()));
> }
> {code}
-- This message was sent by Atlassian Jira (v8.3.4#803005)
[GitHub] [flink] flinkbot commented on issue #11076: [FLINK-15908][docs] Add description of support 'pip install' to 1.9.x documents.
flinkbot commented on issue #11076: [FLINK-15908][docs] Add description of support 'pip install' to 1.9.x documents. URL: https://github.com/apache/flink/pull/11076#issuecomment-585541734 ## CI report: * e2fc04377969fe59ba6de22e47d93050ead27d6a UNKNOWN Bot commands The @flinkbot bot supports the following commands: - `@flinkbot run travis` re-run the last Travis build - `@flinkbot run azure` re-run the last Azure build This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [flink] flinkbot edited a comment on issue #11069: [hotfix] Fix some typos in flink-examples-streaming
flinkbot edited a comment on issue #11069: [hotfix] Fix some typos in flink-examples-streaming URL: https://github.com/apache/flink/pull/11069#issuecomment-585181372 ## CI report: * 059e816c9f5973563a4f03e4c6af2e4af4729b66 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148567365) Azure: [SUCCESS](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5105) * 3cf0448fa909468f6c2d5508fe527be436ef475f Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/148715451) * 578e1424988f95f022c3fbd558b54d5fe8ffe84c UNKNOWN Bot commands The @flinkbot bot supports the following commands: - `@flinkbot run travis` re-run the last Travis build - `@flinkbot run azure` re-run the last Azure build This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [flink] flinkbot edited a comment on issue #11047: [FLINK-15912][table] Add Context to TableSourceFactory and TableSinkFactory
flinkbot edited a comment on issue #11047: [FLINK-15912][table] Add Context to TableSourceFactory and TableSinkFactory URL: https://github.com/apache/flink/pull/11047#issuecomment-584007380 ## CI report: * 86c4939042e8af2da1ac1e5900225f4f0310fa04 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148149333) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=4991) * 18beb9a3c1ed78a6cbc1bc5ba9f96b51bbf8eeea Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148181501) Azure: [SUCCESS](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5000) * a9125de0859c43262d41c4cecee53ceeb807cf27 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148340324) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5041) * 52bc7dcf579ef6e5e466ec93451cbf56451b1d41 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148547890) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5093) * a307994428657e1025786975b7cf11544afa3ab8 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148567327) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5104) * 4749ac089576b36668e412acdbe56f55b4ffbdc6 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148576458) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5106) * 8c7c673b8bf46297d499db5725c127986533c8ff Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148711047) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5119) Bot commands The @flinkbot bot supports the following commands: - `@flinkbot run travis` re-run the last Travis build - `@flinkbot run azure` re-run the last Azure build This is an 
automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [flink] flinkbot commented on issue #11077: [FLINK-15909][build] Add PyPI release process into the subsequent release of 1.9.x.
flinkbot commented on issue #11077: [FLINK-15909][build] Add PyPI release process into the subsequent release of 1.9.x. URL: https://github.com/apache/flink/pull/11077#issuecomment-585538949 Thanks a lot for your contribution to the Apache Flink project. I'm the @flinkbot. I help the community to review your pull request. We will use this comment to track the progress of the review. ## Automated Checks Last check on commit 4ac010892eab3b0a44397b03aff0562c1fe407df (Thu Feb 13 04:01:05 UTC 2020) **Warnings:** * **1 pom.xml file was touched**: Check for build and licensing issues. * No documentation files were touched! Remember to keep the Flink docs up to date! Mention the bot in a comment to re-run the automated checks. ## Review Progress * ❓ 1. The [description] looks good. * ❓ 2. There is [consensus] that the contribution should go into Flink. * ❓ 3. Needs [attention] from. * ❓ 4. The change fits into the overall [architecture]. * ❓ 5. Overall code [quality] is good. Please see the [Pull Request Review Guide](https://flink.apache.org/contributing/reviewing-prs.html) for a full explanation of the review process. The Bot is tracking the review progress through labels. Labels are applied according to the order of the review items. For consensus, approval by a Flink committer or PMC member is required. Bot commands The @flinkbot bot supports the following commands: - `@flinkbot approve description` to approve one or more aspects (aspects: `description`, `consensus`, `architecture` and `quality`) - `@flinkbot approve all` to approve all aspects - `@flinkbot approve-until architecture` to approve everything until `architecture` - `@flinkbot attention @username1 [@username2 ..]` to require somebody's attention - `@flinkbot disapprove architecture` to remove an approval you gave earlier This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. 
For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [flink] flinkbot edited a comment on issue #7702: [FLINK-11088][Security][YARN] Allow YARN to discover pre-installed keytab files
flinkbot edited a comment on issue #7702: [FLINK-11088][Security][YARN] Allow YARN to discover pre-installed keytab files URL: https://github.com/apache/flink/pull/7702#issuecomment-572195960 ## CI report: * 72dd07f5f10a56adf6025e82083af21ada47c711 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/143614040) Azure: [SUCCESS](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=4198) * f2387288cb33f288164ed9d102b47868a93dc898 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148710964) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5118) Bot commands The @flinkbot bot supports the following commands: - `@flinkbot run travis` re-run the last Travis build - `@flinkbot run azure` re-run the last Azure build This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[jira] [Updated] (FLINK-15909) Add PyPI release process into the subsequent release of 1.9.x
[ https://issues.apache.org/jira/browse/FLINK-15909?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] ASF GitHub Bot updated FLINK-15909: --- Labels: pull-request-available (was: ) > Add PyPI release process into the subsequent release of 1.9.x > -- > > Key: FLINK-15909 > URL: https://issues.apache.org/jira/browse/FLINK-15909 > Project: Flink > Issue Type: Bug > Components: Build System >Reporter: sunjincheng >Assignee: Wei Zhong >Priority: Major > Labels: pull-request-available > Fix For: 1.9.3 > > > Add PyPI release process into the subsequent release of 1.9.x. i.e., improve > the script of `create-binary-release.sh`. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[GitHub] [flink] WeiZhong94 opened a new pull request #11077: [FLINK-15909][build] Add PyPI release process into the subsequent release of 1.9.x.
WeiZhong94 opened a new pull request #11077: [FLINK-15909][build] Add PyPI release process into the subsequent release of 1.9.x. URL: https://github.com/apache/flink/pull/11077 ## What is the purpose of the change *This pull request adds PyPI release process into the subsequent release of 1.9.x.* ## Brief change log - *Add configuration in pom.xml to clean old files.* - *Add `make_python_release` function in `create_binary_release.sh`* - *Exclude unnecessary files when creating source release.* ## Verifying this change This change is a trivial rework / code cleanup without any test coverage. ## Does this pull request potentially affect one of the following parts: - Dependencies (does it add or upgrade a dependency): (no) - The public API, i.e., is any changed class annotated with `@Public(Evolving)`: (no) - The serializers: (no) - The runtime per-record code paths (performance sensitive): (no) - Anything that affects deployment or recovery: JobManager (and its components), Checkpointing, Yarn/Mesos, ZooKeeper: (no) - The S3 file system connector: (no) ## Documentation - Does this pull request introduce a new feature? (no) - If yes, how is the feature documented? (not applicable) This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [flink] flinkbot edited a comment on issue #11069: [hotfix] Fix some typos in flink-examples-streaming
flinkbot edited a comment on issue #11069: [hotfix] Fix some typos in flink-examples-streaming
URL: https://github.com/apache/flink/pull/11069#issuecomment-585181372

## CI report:

* 059e816c9f5973563a4f03e4c6af2e4af4729b66 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148567365) Azure: [SUCCESS](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5105)
* 3cf0448fa909468f6c2d5508fe527be436ef475f UNKNOWN

Bot commands

The @flinkbot bot supports the following commands:

- `@flinkbot run travis` re-run the last Travis build
- `@flinkbot run azure` re-run the last Azure build
[GitHub] [flink] flinkbot commented on issue #11076: [FLINK-15908][docs] Add description of support 'pip install' to 1.9.x documents.
flinkbot commented on issue #11076: [FLINK-15908][docs] Add description of support 'pip install' to 1.9.x documents.
URL: https://github.com/apache/flink/pull/11076#issuecomment-585533515

Thanks a lot for your contribution to the Apache Flink project. I'm the @flinkbot. I help the community to review your pull request. We will use this comment to track the progress of the review.

## Automated Checks

Last check on commit e2fc04377969fe59ba6de22e47d93050ead27d6a (Thu Feb 13 03:34:13 UTC 2020)

✅ no warnings

Mention the bot in a comment to re-run the automated checks.

## Review Progress

* ❓ 1. The [description] looks good.
* ❓ 2. There is [consensus] that the contribution should go into Flink.
* ❓ 3. Needs [attention] from.
* ❓ 4. The change fits into the overall [architecture].
* ❓ 5. Overall code [quality] is good.

Please see the [Pull Request Review Guide](https://flink.apache.org/contributing/reviewing-prs.html) for a full explanation of the review process.

The bot is tracking the review progress through labels. Labels are applied according to the order of the review items. For consensus, approval by a Flink committer or PMC member is required.

Bot commands

The @flinkbot bot supports the following commands:

- `@flinkbot approve description` to approve one or more aspects (aspects: `description`, `consensus`, `architecture` and `quality`)
- `@flinkbot approve all` to approve all aspects
- `@flinkbot approve-until architecture` to approve everything until `architecture`
- `@flinkbot attention @username1 [@username2 ..]` to require somebody's attention
- `@flinkbot disapprove architecture` to remove an approval you gave earlier
[GitHub] [flink] WeiZhong94 opened a new pull request #11076: [FLINK-15908][docs] Add description of support 'pip install' to 1.9.x documents.
WeiZhong94 opened a new pull request #11076: [FLINK-15908][docs] Add description of support 'pip install' to 1.9.x documents.
URL: https://github.com/apache/flink/pull/11076

## What is the purpose of the change

*This pull request adds a description of 'pip install' support to the 1.9.x documents.*

## Brief change log

  - *Adds a description of 'pip install' support to `python_table_api.md`.*
  - *Adds a description of 'pip install' support to `python_table_api.zh.md`.*

## Verifying this change

This change is a trivial rework / code cleanup without any test coverage.

## Does this pull request potentially affect one of the following parts:

  - Dependencies (does it add or upgrade a dependency): (no)
  - The public API, i.e., is any changed class annotated with `@Public(Evolving)`: (no)
  - The serializers: (no)
  - The runtime per-record code paths (performance sensitive): (no)
  - Anything that affects deployment or recovery: JobManager (and its components), Checkpointing, Yarn/Mesos, ZooKeeper: (no)
  - The S3 file system connector: (no)

## Documentation

  - Does this pull request introduce a new feature? (no)
  - If yes, how is the feature documented? (docs)
[jira] [Updated] (FLINK-15908) Add description of support 'pip install' to 1.9.x documents
[ https://issues.apache.org/jira/browse/FLINK-15908?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

ASF GitHub Bot updated FLINK-15908:
-----------------------------------
    Labels: pull-request-available  (was: )

> Add description of support 'pip install' to 1.9.x documents
> ------------------------------------------------------------
>
>                 Key: FLINK-15908
>                 URL: https://issues.apache.org/jira/browse/FLINK-15908
>             Project: Flink
>          Issue Type: Bug
>          Components: Documentation
>            Reporter: sunjincheng
>            Assignee: Wei Zhong
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: 1.9.3
>
> Add description of support 'pip install' to 1.9.x documents.
[GitHub] [flink] flinkbot edited a comment on issue #11047: [FLINK-15912][table] Add Context to TableSourceFactory and TableSinkFactory
flinkbot edited a comment on issue #11047: [FLINK-15912][table] Add Context to TableSourceFactory and TableSinkFactory
URL: https://github.com/apache/flink/pull/11047#issuecomment-584007380

## CI report:

* 86c4939042e8af2da1ac1e5900225f4f0310fa04 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148149333) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=4991)
* 18beb9a3c1ed78a6cbc1bc5ba9f96b51bbf8eeea Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148181501) Azure: [SUCCESS](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5000)
* a9125de0859c43262d41c4cecee53ceeb807cf27 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148340324) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5041)
* 52bc7dcf579ef6e5e466ec93451cbf56451b1d41 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148547890) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5093)
* a307994428657e1025786975b7cf11544afa3ab8 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148567327) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5104)
* 4749ac089576b36668e412acdbe56f55b4ffbdc6 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148576458) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5106)
* 8c7c673b8bf46297d499db5725c127986533c8ff Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/148711047) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5119)

Bot commands

The @flinkbot bot supports the following commands:

- `@flinkbot run travis` re-run the last Travis build
- `@flinkbot run azure` re-run the last Azure build
[GitHub] [flink] flinkbot edited a comment on issue #7368: [FLINK-10742][network] Let Netty use Flink's buffers directly in credit-based mode
flinkbot edited a comment on issue #7368: [FLINK-10742][network] Let Netty use Flink's buffers directly in credit-based mode
URL: https://github.com/apache/flink/pull/7368#issuecomment-567435407

## CI report:

* 6b12b52b99894864db993a2fa8ab2bfcce0edd5c Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/141736780)
* 25fc5608d0d7143f4384a7648054e9c99ebb32e9 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/145174867) Azure: [SUCCESS](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=4502)
* 0bd82aa136c43af8fdfde0f2a64284323c40c999 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148123283) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=4982)
* 99f9c6528f53574829368b52de5fa6c3f0a41d44 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148132166) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=4986)
* e61197d1bb5a61a55051b193a01543edfa8a22b6 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148706320) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5117)

Bot commands

The @flinkbot bot supports the following commands:

- `@flinkbot run travis` re-run the last Travis build
- `@flinkbot run azure` re-run the last Azure build
[GitHub] [flink] flinkbot edited a comment on issue #7702: [FLINK-11088][Security][YARN] Allow YARN to discover pre-installed keytab files
flinkbot edited a comment on issue #7702: [FLINK-11088][Security][YARN] Allow YARN to discover pre-installed keytab files
URL: https://github.com/apache/flink/pull/7702#issuecomment-572195960

## CI report:

* 72dd07f5f10a56adf6025e82083af21ada47c711 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/143614040) Azure: [SUCCESS](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=4198)
* f2387288cb33f288164ed9d102b47868a93dc898 Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/148710964) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5118)

Bot commands

The @flinkbot bot supports the following commands:

- `@flinkbot run travis` re-run the last Travis build
- `@flinkbot run azure` re-run the last Azure build
[GitHub] [flink] flinkbot edited a comment on issue #11047: [FLINK-15912][table] Add Context to TableSourceFactory and TableSinkFactory
flinkbot edited a comment on issue #11047: [FLINK-15912][table] Add Context to TableSourceFactory and TableSinkFactory
URL: https://github.com/apache/flink/pull/11047#issuecomment-584007380

## CI report:

* 86c4939042e8af2da1ac1e5900225f4f0310fa04 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148149333) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=4991)
* 18beb9a3c1ed78a6cbc1bc5ba9f96b51bbf8eeea Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148181501) Azure: [SUCCESS](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5000)
* a9125de0859c43262d41c4cecee53ceeb807cf27 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148340324) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5041)
* 52bc7dcf579ef6e5e466ec93451cbf56451b1d41 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148547890) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5093)
* a307994428657e1025786975b7cf11544afa3ab8 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148567327) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5104)
* 4749ac089576b36668e412acdbe56f55b4ffbdc6 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148576458) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5106)
* 8c7c673b8bf46297d499db5725c127986533c8ff UNKNOWN

Bot commands

The @flinkbot bot supports the following commands:

- `@flinkbot run travis` re-run the last Travis build
- `@flinkbot run azure` re-run the last Azure build
[GitHub] [flink] flinkbot edited a comment on issue #7702: [FLINK-11088][Security][YARN] Allow YARN to discover pre-installed keytab files
flinkbot edited a comment on issue #7702: [FLINK-11088][Security][YARN] Allow YARN to discover pre-installed keytab files
URL: https://github.com/apache/flink/pull/7702#issuecomment-572195960

## CI report:

* 72dd07f5f10a56adf6025e82083af21ada47c711 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/143614040) Azure: [SUCCESS](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=4198)
* f2387288cb33f288164ed9d102b47868a93dc898 UNKNOWN

Bot commands

The @flinkbot bot supports the following commands:

- `@flinkbot run travis` re-run the last Travis build
- `@flinkbot run azure` re-run the last Azure build
[GitHub] [flink] dianfu commented on a change in pull request #11059: [FLINK-15970][python] Optimize the Python UDF execution to only serialize the value
dianfu commented on a change in pull request #11059: [FLINK-15970][python] Optimize the Python UDF execution to only serialize the value
URL: https://github.com/apache/flink/pull/11059#discussion_r378617961

## File path: flink-python/src/main/java/org/apache/flink/table/runtime/runners/python/AbstractPythonStatelessFunctionRunner.java

## @@ -161,6 +170,20 @@ public ExecutableStage createExecutableStage() throws Exception { return builder.build(); } + private RunnerApi.WireCoderSetting createWireCoderSetting() throws IOException {

Review comment: What about renaming `createWireCoderSetting` to `createValueOnlyWireCoderSetting`?
[jira] [Comment Edited] (FLINK-15948) Resource will be wasted when the task manager memory is not a multiple of Yarn minimum allocation
[ https://issues.apache.org/jira/browse/FLINK-15948?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17035181#comment-17035181 ]

Xintong Song edited comment on FLINK-15948 at 2/13/20 2:07 AM:
---------------------------------------------------------------

1. I just checked on this. -Turns out we already have such checking and warning logs on client side.- We have a check on whether the process memory is smaller than the yarn min allocation, but not on whether it is an exact division. See {{YarnClusterDescriptor#validateClusterResources}}.

2. AFAIK, it is proposed in [FLIP-75|https://docs.google.com/document/d/1tIa8yN2prWWKJI_fa1u0t6h1r6RJpp56m48pXEyh6iI/edit?usp=sharing] to add metrics for the task executor total process memory size, which should be the container size on Kubernetes. It seems not necessary to add another metric only for Yarn, where the container size could be larger than the process memory size, especially when we already have the warning log. Not sure whether the container size is guaranteed to be exactly as much as requested on Mesos, though.

was (Author: xintongsong):

1. I just checked on this. Turns out we already have such checking and warning logs on client side. See {{YarnClusterDescriptor#validateClusterResources}}.

2. AFAIK, it is proposed in [FLIP-75|https://docs.google.com/document/d/1tIa8yN2prWWKJI_fa1u0t6h1r6RJpp56m48pXEyh6iI/edit?usp=sharing] to add metrics for the task executor total process memory size, which should be the container size on Kubernetes. It seems not necessary to add another metric only for Yarn, where the container size could be larger than the process memory size, especially when we already have the warning log. Not sure whether the container size is guaranteed to be exactly as much as requested on Mesos, though.
> Resource will be wasted when the task manager memory is not a multiple of Yarn minimum allocation
> --------------------------------------------------------------------------------------------------
>
>                 Key: FLINK-15948
>                 URL: https://issues.apache.org/jira/browse/FLINK-15948
>             Project: Flink
>          Issue Type: Bug
>          Components: Deployment / YARN
>    Affects Versions: 1.10.0
>            Reporter: Yang Wang
>            Priority: Major
>
> If the {{taskmanager.memory.process.size}} is set to 2000m and the Yarn minimum allocation is 128m, we will get a container with 2048m. Currently, {{TaskExecutorProcessSpec}} is built with 2000m, so 48m are wasted and cannot be used by Flink.
> I think Flink has accounted for all the JVM heap, off-heap, and overhead resources, so we should not leave this free memory unused. I suggest updating the {{TaskExecutorProcessSpec}} according to the Yarn allocated container.
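The arithmetic behind the waste described in this issue can be sketched as follows. This is an illustrative model, not Flink or YARN code: YARN rounds each container request up to the next multiple of its minimum allocation, while Flink's process spec keeps the originally requested size, so the difference goes unused. The function names are hypothetical.

```python
import math

def yarn_allocated_mb(requested_mb: int, min_allocation_mb: int) -> int:
    """YARN grants containers in multiples of the minimum allocation,
    rounding the request up to the next multiple."""
    return math.ceil(requested_mb / min_allocation_mb) * min_allocation_mb

def wasted_mb(requested_mb: int, min_allocation_mb: int) -> int:
    """Memory granted to the container but not reflected in the
    process spec built from the original request."""
    return yarn_allocated_mb(requested_mb, min_allocation_mb) - requested_mb

# The numbers from the issue report: a 2000m request against a 128m
# minimum allocation yields a 2048m container, leaving 48m unused.
print(yarn_allocated_mb(2000, 128))  # 2048
print(wasted_mb(2000, 128))          # 48
```

A request that is already an exact multiple of the minimum allocation (e.g. 2048m) would waste nothing, which is why the comment above distinguishes the existing "smaller than min allocation" check from an "exact division" check.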