[jira] [Updated] (FLINK-27997) How to unregister custom metrics at runtime?
[ https://issues.apache.org/jira/browse/FLINK-27997?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

henvealf updated FLINK-27997:
-----------------------------
    Description: 
How can custom metrics be unregistered? We are worried about memory overflow caused by registering too many metrics. Are there any suggestions for adding a feature to delete metrics?

  was:
How can custom metrics be unregistered? We are worried about memory overflow caused by registering too many metrics.

> How to unregister custom metrics at runtime?
> --------------------------------------------
>
>                 Key: FLINK-27997
>                 URL: https://issues.apache.org/jira/browse/FLINK-27997
>             Project: Flink
>          Issue Type: Improvement
>          Components: Runtime / Metrics
>            Reporter: henvealf
>            Priority: Major

--
This message was sent by Atlassian Jira
(v8.20.7#820007)
[jira] [Updated] (FLINK-27997) How to unregister custom metrics at runtime?
[ https://issues.apache.org/jira/browse/FLINK-27997?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

henvealf updated FLINK-27997:
-----------------------------
    Description: 
How can custom metrics be unregistered? We are worried about memory overflow caused by registering too many metrics. Are there any suggestions for adding a feature to unregister metrics?

  was:
How can custom metrics be unregistered? We are worried about memory overflow caused by registering too many metrics. Are there any suggestions for adding a feature to delete metrics?

> How to unregister custom metrics at runtime?
> --------------------------------------------
>
>                 Key: FLINK-27997
>                 URL: https://issues.apache.org/jira/browse/FLINK-27997
>             Project: Flink
>          Issue Type: Improvement
>          Components: Runtime / Metrics
>            Reporter: henvealf
>            Priority: Major

--
This message was sent by Atlassian Jira
(v8.20.7#820007)
[jira] [Created] (FLINK-27997) How to unregister custom metrics at runtime?
henvealf created FLINK-27997:
--------------------------------

             Summary: How to unregister custom metrics at runtime?
                 Key: FLINK-27997
                 URL: https://issues.apache.org/jira/browse/FLINK-27997
             Project: Flink
          Issue Type: Improvement
          Components: Runtime / Metrics
            Reporter: henvealf

How can custom metrics be unregistered? We are worried about memory overflow caused by registering too many metrics.

--
This message was sent by Atlassian Jira
(v8.20.7#820007)
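[Editor's note] Flink's MetricGroup interface exposes no explicit unregister call; metrics are dropped only when their owning metric group is closed (e.g. at task teardown). A common user-side workaround is to bound how many dynamically created metric objects exist at all. The sketch below is plain Java with no Flink types — the class, its cap, and its eviction policy are illustrative assumptions, not a Flink API:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.concurrent.atomic.LongAdder;

/**
 * Hypothetical user-side workaround (not a Flink API): keep dynamically
 * created per-key counters in a bounded, access-ordered (LRU) map so the
 * number of live metric objects cannot grow without limit.
 */
class BoundedCounters {
    private final Map<String, LongAdder> counters;

    BoundedCounters(final int maxSize) {
        // Access-ordered LinkedHashMap that evicts the least recently used
        // entry once more than maxSize counters exist.
        this.counters = new LinkedHashMap<String, LongAdder>(16, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<String, LongAdder> eldest) {
                return size() > maxSize;
            }
        };
    }

    /** Returns the counter for the key, creating (and possibly evicting) as needed. */
    LongAdder get(String key) {
        LongAdder counter = counters.get(key);
        if (counter == null) {
            counter = new LongAdder();
            counters.put(key, counter); // put() triggers the eviction check above
        }
        return counter;
    }

    int size() {
        return counters.size();
    }
}
```

Evicted counters become garbage-collectable, which addresses the memory concern, at the cost of losing the evicted keys' counts; whether that trade-off is acceptable depends on the job.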
[jira] [Closed] (FLINK-19242) org.apache.flink.table.api.ValidationException: Cannot resolve field
[ https://issues.apache.org/jira/browse/FLINK-19242?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

henvealf closed FLINK-19242.
----------------------------
    Resolution: Fixed

This was my mistake. It worked after changing $"w" to $"w".end() in the final select — after the windowed aggregation, the window alias itself is no longer a field; only its properties (such as the window end time) can be selected:

{code:java}
val countResult = eventTable
  .select($"name", $"product", $"id", $"_event_time")
  .window(Tumble over 10.second() on $"_event_time" as "w")
  .groupBy($"id", $"w")
  .select($"id".count() as("c"), $"id", $"w".end())
{code}

> org.apache.flink.table.api.ValidationException: Cannot resolve field
> ---------------------------------------------------------------------
>
>                 Key: FLINK-19242
>                 URL: https://issues.apache.org/jira/browse/FLINK-19242
>             Project: Flink
>          Issue Type: Bug
>          Components: Table SQL / API
>    Affects Versions: 1.11.1
>            Reporter: henvealf
>            Priority: Major
>
> Hello,
> Planner: Blink
> The Code:
> {code:java}
> val countResult = eventTable
>   .select($"name", $"product", $"id", $"_event_time")
>   .window(Tumble over 10.second() on $"_event_time" as "w")
>   .groupBy($"id", $"w")
>   .select($"id".count() as("c"), $"id", $"w")
> {code}
> Exception:
> {code:java}
> Exception in thread "main" org.apache.flink.table.api.ValidationException: Cannot resolve field [w], input field list:[id, EXPR$0].
> 	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule$ExpressionResolverVisitor.failForField(ReferenceResolverRule.java:80)
> 	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule$ExpressionResolverVisitor.lambda$null$3(ReferenceResolverRule.java:75)
> 	at java.util.Optional.orElseThrow(Optional.java:290)
> 	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule$ExpressionResolverVisitor.lambda$null$4(ReferenceResolverRule.java:75)
> 	at java.util.Optional.orElseGet(Optional.java:267)
> 	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule$ExpressionResolverVisitor.lambda$visit$5(ReferenceResolverRule.java:74)
> 	at java.util.Optional.orElseGet(Optional.java:267)
> 	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule$ExpressionResolverVisitor.visit(ReferenceResolverRule.java:71)
> 	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule$ExpressionResolverVisitor.visit(ReferenceResolverRule.java:51)
> 	at org.apache.flink.table.expressions.ApiExpressionVisitor.visit(ApiExpressionVisitor.java:31)
> 	at org.apache.flink.table.expressions.UnresolvedReferenceExpression.accept(UnresolvedReferenceExpression.java:60)
> 	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule.lambda$apply$0(ReferenceResolverRule.java:47)
> 	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
> 	at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1382)
> 	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
> 	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
> 	at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
> 	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
> 	at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
> 	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule.apply(ReferenceResolverRule.java:48)
> 	at org.apache.flink.table.expressions.resolver.ExpressionResolver.lambda$null$1(ExpressionResolver.java:211)
> 	at java.util.function.Function.lambda$andThen$1(Function.java:88)
> 	at java.util.function.Function.lambda$andThen$1(Function.java:88)
> 	at java.util.function.Function.lambda$andThen$1(Function.java:88)
> 	at org.apache.flink.table.expressions.resolver.ExpressionResolver.resolve(ExpressionResolver.java:178)
> 	at org.apache.flink.table.operations.utils.OperationTreeBuilder.projectInternal(OperationTreeBuilder.java:191)
> 	at org.apache.flink.table.operations.utils.OperationTreeBuilder.project(OperationTreeBuilder.java:160)
> 	at org.apache.flink.table.api.internal.TableImpl$WindowGroupedTableImpl.select(TableImpl.java:792)
> 	...
> Process finished with exit code 1
> {code}
> Why?
> Thanks!

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
[jira] [Reopened] (FLINK-19242) org.apache.flink.table.api.ValidationException: Cannot resolve field
[ https://issues.apache.org/jira/browse/FLINK-19242?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

henvealf reopened FLINK-19242:
------------------------------

> org.apache.flink.table.api.ValidationException: Cannot resolve field
> ---------------------------------------------------------------------
>
>                 Key: FLINK-19242
>                 URL: https://issues.apache.org/jira/browse/FLINK-19242
>             Project: Flink
>          Issue Type: Bug
>          Components: Table SQL / API
>    Affects Versions: 1.11.1
>            Reporter: henvealf
>            Priority: Major

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
[jira] [Closed] (FLINK-19242) org.apache.flink.table.api.ValidationException: Cannot resolve field
[ https://issues.apache.org/jira/browse/FLINK-19242?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

henvealf closed FLINK-19242.
----------------------------
    Resolution: Fixed

> org.apache.flink.table.api.ValidationException: Cannot resolve field
> ---------------------------------------------------------------------
>
>                 Key: FLINK-19242
>                 URL: https://issues.apache.org/jira/browse/FLINK-19242
>             Project: Flink
>          Issue Type: Bug
>          Components: Table SQL / API
>    Affects Versions: 1.11.1
>            Reporter: henvealf
>            Priority: Major

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
[jira] [Updated] (FLINK-19242) org.apache.flink.table.api.ValidationException: Cannot resolve field
[ https://issues.apache.org/jira/browse/FLINK-19242?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

henvealf updated FLINK-19242:
-----------------------------
    Description: 
Hello,

Planner: Blink

The Code:
{code:java}
val countResult = eventTable
  .select($"name", $"product", $"id", $"_event_time")
  .window(Tumble over 10.second() on $"_event_time" as "w")
  .groupBy($"id", $"w")
  .select($"id".count() as("c"), $"id", $"w")
{code}

Exception:
{code:java}
Exception in thread "main" org.apache.flink.table.api.ValidationException: Cannot resolve field [w], input field list:[id, EXPR$0].
	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule$ExpressionResolverVisitor.failForField(ReferenceResolverRule.java:80)
	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule$ExpressionResolverVisitor.lambda$null$3(ReferenceResolverRule.java:75)
	at java.util.Optional.orElseThrow(Optional.java:290)
	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule$ExpressionResolverVisitor.lambda$null$4(ReferenceResolverRule.java:75)
	at java.util.Optional.orElseGet(Optional.java:267)
	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule$ExpressionResolverVisitor.lambda$visit$5(ReferenceResolverRule.java:74)
	at java.util.Optional.orElseGet(Optional.java:267)
	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule$ExpressionResolverVisitor.visit(ReferenceResolverRule.java:71)
	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule$ExpressionResolverVisitor.visit(ReferenceResolverRule.java:51)
	at org.apache.flink.table.expressions.ApiExpressionVisitor.visit(ApiExpressionVisitor.java:31)
	at org.apache.flink.table.expressions.UnresolvedReferenceExpression.accept(UnresolvedReferenceExpression.java:60)
	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule.lambda$apply$0(ReferenceResolverRule.java:47)
	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
	at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1382)
	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
	at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
	at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule.apply(ReferenceResolverRule.java:48)
	at org.apache.flink.table.expressions.resolver.ExpressionResolver.lambda$null$1(ExpressionResolver.java:211)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.flink.table.expressions.resolver.ExpressionResolver.resolve(ExpressionResolver.java:178)
	at org.apache.flink.table.operations.utils.OperationTreeBuilder.projectInternal(OperationTreeBuilder.java:191)
	at org.apache.flink.table.operations.utils.OperationTreeBuilder.project(OperationTreeBuilder.java:160)
	at org.apache.flink.table.api.internal.TableImpl$WindowGroupedTableImpl.select(TableImpl.java:792)
	...
Process finished with exit code 1
{code}

Why?
Thanks!
[jira] [Created] (FLINK-19242) org.apache.flink.table.api.ValidationException: Cannot resolve field
henvealf created FLINK-19242:
--------------------------------

             Summary: org.apache.flink.table.api.ValidationException: Cannot resolve field
                 Key: FLINK-19242
                 URL: https://issues.apache.org/jira/browse/FLINK-19242
             Project: Flink
          Issue Type: Bug
          Components: Table SQL / API
    Affects Versions: 1.11.1
            Reporter: henvealf

Hello,

The Code:
{code:java}
val countResult = eventTable
  .select($"name", $"product", $"id", $"_event_time")
  .window(Tumble over 10.second() on $"_event_time" as "w")
  .groupBy($"id", $"w")
  .select($"id".count() as("c"), $"id", $"w")
{code}

Exception:
{code:java}
Exception in thread "main" org.apache.flink.table.api.ValidationException: Cannot resolve field [w], input field list:[id, EXPR$0].
	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule$ExpressionResolverVisitor.failForField(ReferenceResolverRule.java:80)
	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule$ExpressionResolverVisitor.lambda$null$3(ReferenceResolverRule.java:75)
	at java.util.Optional.orElseThrow(Optional.java:290)
	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule$ExpressionResolverVisitor.lambda$null$4(ReferenceResolverRule.java:75)
	at java.util.Optional.orElseGet(Optional.java:267)
	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule$ExpressionResolverVisitor.lambda$visit$5(ReferenceResolverRule.java:74)
	at java.util.Optional.orElseGet(Optional.java:267)
	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule$ExpressionResolverVisitor.visit(ReferenceResolverRule.java:71)
	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule$ExpressionResolverVisitor.visit(ReferenceResolverRule.java:51)
	at org.apache.flink.table.expressions.ApiExpressionVisitor.visit(ApiExpressionVisitor.java:31)
	at org.apache.flink.table.expressions.UnresolvedReferenceExpression.accept(UnresolvedReferenceExpression.java:60)
	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule.lambda$apply$0(ReferenceResolverRule.java:47)
	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
	at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1382)
	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
	at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
	at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule.apply(ReferenceResolverRule.java:48)
	at org.apache.flink.table.expressions.resolver.ExpressionResolver.lambda$null$1(ExpressionResolver.java:211)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.flink.table.expressions.resolver.ExpressionResolver.resolve(ExpressionResolver.java:178)
	at org.apache.flink.table.operations.utils.OperationTreeBuilder.projectInternal(OperationTreeBuilder.java:191)
	at org.apache.flink.table.operations.utils.OperationTreeBuilder.project(OperationTreeBuilder.java:160)
	at org.apache.flink.table.api.internal.TableImpl$WindowGroupedTableImpl.select(TableImpl.java:792)
	...
Process finished with exit code 1
{code}

Why?
Thanks!

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
[jira] [Closed] (FLINK-19060) Checkpoint not triggered when using a broadcast stream
[ https://issues.apache.org/jira/browse/FLINK-19060?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

henvealf closed FLINK-19060.
----------------------------
    Resolution: Not A Bug

> Checkpoint not triggered when using a broadcast stream
> ------------------------------------------------------
>
>                 Key: FLINK-19060
>                 URL: https://issues.apache.org/jira/browse/FLINK-19060
>             Project: Flink
>          Issue Type: Bug
>          Components: API / DataStream
>    Affects Versions: 1.11.1
>            Reporter: henvealf
>            Priority: Major
>         Attachments: image-2020-08-27-16-41-23-699.png, image-2020-08-27-16-44-37-442.png, image-2020-08-27-16-45-28-134.png, image-2020-08-27-16-51-10-512.png
>
>   Original Estimate: 1h
>  Remaining Estimate: 1h
>
> Code:
> !image-2020-08-27-16-51-10-512.png!
> KafkaSourceConfig:
> consumer.setStartFromGroupOffsets()
> Web UI:
> !image-2020-08-27-16-45-28-134.png!
> Checkpoints are never triggered. Did I write something wrong?
> Thanks!

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
[jira] [Commented] (FLINK-19060) Checkpoint not triggered when using a broadcast stream
[ https://issues.apache.org/jira/browse/FLINK-19060?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17185800#comment-17185800 ]

henvealf commented on FLINK-19060:
----------------------------------
Hi [~yunta] [~aljoscha], the relevant JobManager log output is:

"Checkpoint triggering task Source: Collection Source (1/1) of job 8b4fc3ae2a6d5902a5506970c797eb9f is not in state RUNNING but FINISHED instead. Aborting checkpoint."

The situation is the same as [~yunta] described; I understand now. It would be better if the official documentation described this behavior. Thanks.

> Checkpoint not triggered when using a broadcast stream
> ------------------------------------------------------
>
>                 Key: FLINK-19060
>                 URL: https://issues.apache.org/jira/browse/FLINK-19060
>             Project: Flink
>          Issue Type: Bug
>          Components: API / DataStream
>    Affects Versions: 1.11.1
>            Reporter: henvealf
>            Priority: Major

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
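[Editor's note] The log line above reflects a limitation of Flink up to 1.13: a checkpoint is aborted as soon as any task (here, the finite broadcast source) is in state FINISHED. FLIP-147 lifted this in later releases; if memory serves, checkpointing after some tasks finish is gated by the configuration option below (disabled by default in 1.14, enabled by default in 1.15) — verify the exact name against the documentation of the version in use:

```yaml
# flink-conf.yaml (Flink 1.14+; option per FLIP-147, name assumed from that release)
execution.checkpointing.checkpoints-after-tasks-finish.enabled: true
```

On 1.11, as discussed in the thread, the workaround is to keep the broadcast source unbounded so it never reaches FINISHED.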
[jira] [Updated] (FLINK-19060) Checkpoint not triggered when using a broadcast stream
[ https://issues.apache.org/jira/browse/FLINK-19060?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

henvealf updated FLINK-19060:
-----------------------------
    Attachment: image-2020-08-27-16-51-10-512.png
    Description: 
Code:
!image-2020-08-27-16-51-10-512.png!

KafkaSourceConfig:
consumer.setStartFromGroupOffsets()

Web UI:
!image-2020-08-27-16-45-28-134.png!

Checkpoints are never triggered. Did I write something wrong?
Thanks!

  was:
Code:
!image-2020-08-27-16-43-30-536.png!

KafkaSourceConfig:
consumer.setStartFromGroupOffsets()

Web UI:
!image-2020-08-27-16-45-28-134.png!

Checkpoints are never triggered. Did I write something wrong?
Thanks!

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
[jira] [Created] (FLINK-19060) Checkpoint not triggered when using a broadcast stream
henvealf created FLINK-19060:
--------------------------------

             Summary: Checkpoint not triggered when using a broadcast stream
                 Key: FLINK-19060
                 URL: https://issues.apache.org/jira/browse/FLINK-19060
             Project: Flink
          Issue Type: Bug
          Components: API / DataStream
    Affects Versions: 1.11.1
            Reporter: henvealf
         Attachments: image-2020-08-27-16-41-23-699.png, image-2020-08-27-16-44-37-442.png, image-2020-08-27-16-45-28-134.png

Code:
!image-2020-08-27-16-43-30-536.png!

KafkaSourceConfig:
consumer.setStartFromGroupOffsets()

Web UI:
!image-2020-08-27-16-45-28-134.png!

Checkpoints are never triggered. Did I write something wrong?
Thanks!

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
[jira] [Updated] (FLINK-18599) Compile error when using windowAll and TumblingProcessingTimeWindows
[ https://issues.apache.org/jira/browse/FLINK-18599?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

henvealf updated FLINK-18599:
-----------------------------
    Description: 
Code:
{code:java}
import org.apache.commons.lang3.StringUtils
import org.apache.flink.streaming.api.scala.StreamExecutionEnvironment
import org.apache.flink.streaming.api.windowing.assigners.TumblingProcessingTimeWindows
import org.apache.flink.streaming.api.windowing.time.Time
import org.apache.flink.streaming.api.scala._

val env = StreamExecutionEnvironment.getExecutionEnvironment
val stream = env.fromElements("a", "b", "c")

stream
  .filter((str: String) => StringUtils.isNotEmpty(str))
  .map(_ => 1)
  .windowAll(TumblingProcessingTimeWindows.of(Time.seconds(5)))
  .reduce((a1, a2) => a1 + a2)
  .print()
{code}

Compilation fails with:
{code:java}
error: type mismatch;
 found   : org.apache.flink.streaming.api.windowing.assigners.TumblingProcessingTimeWindows
 required: org.apache.flink.streaming.api.windowing.assigners.WindowAssigner[_ >: Int, ?]
Note: Object <: Any (and org.apache.flink.streaming.api.windowing.assigners.TumblingProcessingTimeWindows <: org.apache.flink.streaming.api.windowing.assigners.WindowAssigner[Object,org.apache.flink.streaming.api.windowing.windows.TimeWindow]), but Java-defined class WindowAssigner is invariant in type T.
You may wish to investigate a wildcard type such as `_ <: Any`. (SLS 3.2.10)
    .windowAll(TumblingProcessingTimeWindows.of(Time.seconds(5)))
     ^
one error found
{code}

What went wrong?

Scala version: 2.11
Flink version: 1.11

Thanks.
> Compile error when using windowAll and TumblingProcessingTimeWindows
> --------------------------------------------------------------------
>
>                 Key: FLINK-18599
>                 URL: https://issues.apache.org/jira/browse/FLINK-18599
>             Project: Flink
>          Issue Type: Bug
>          Components: API / DataStream
>    Affects Versions: 1.11.0
>            Reporter: henvealf
>            Priority: Major
[jira] [Created] (FLINK-18599) Compile error when using windowAll and TumblingProcessingTimeWindows
henvealf created FLINK-18599:
--------------------------------

             Summary: Compile error when using windowAll and TumblingProcessingTimeWindows
                 Key: FLINK-18599
                 URL: https://issues.apache.org/jira/browse/FLINK-18599
             Project: Flink
          Issue Type: Bug
          Components: API / DataStream
    Affects Versions: 1.11.0
            Reporter: henvealf

Code:
{code:java}
import org.apache.commons.lang3.StringUtils
import org.apache.flink.streaming.api.scala.StreamExecutionEnvironment
import org.apache.flink.streaming.api.windowing.assigners.TumblingProcessingTimeWindows
import org.apache.flink.streaming.api.windowing.time.Time
import org.apache.flink.streaming.api.scala._

val env = StreamExecutionEnvironment.getExecutionEnvironment
val stream = env.fromElements("a", "b", "c")

stream
  .filter((str: String) => StringUtils.isNotEmpty(str))
  .map(_ => 1)
  .windowAll(TumblingProcessingTimeWindows.of(Time.seconds(5)))
  .reduce((a1, a2) => a1 + a2)
  .print()
{code}

Compilation fails with:
{code:java}
error: type mismatch;
 found   : org.apache.flink.streaming.api.windowing.assigners.TumblingProcessingTimeWindows
 required: org.apache.flink.streaming.api.windowing.assigners.WindowAssigner[_ >: Int, ?]
Note: Object <: Any (and org.apache.flink.streaming.api.windowing.assigners.TumblingProcessingTimeWindows <: org.apache.flink.streaming.api.windowing.assigners.WindowAssigner[Object,org.apache.flink.streaming.api.windowing.windows.TimeWindow]), but Java-defined class WindowAssigner is invariant in type T.
You may wish to investigate a wildcard type such as `_ <: Any`. (SLS 3.2.10)
    .windowAll(TumblingProcessingTimeWindows.of(Time.seconds(5)))
     ^
one error found
{code}

What went wrong?
Thanks.

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
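[Editor's note] The compiler's note is a variance problem: TumblingProcessingTimeWindows is a WindowAssigner[Object, TimeWindow], and windowAll on a stream of Int requires a WindowAssigner[_ >: Int, ?] — but Scala's Int is a value type that is not a subtype of Object, so the conformance check fails. The Java stand-in below (class names Assigner and Pipeline are hypothetical, not Flink's API) shows why the same shape type-checks once the element is a boxed reference type; a possible workaround in the Scala job, untested here, is mapping to java.lang.Integer (or a case class) instead of Int:

```java
// Stand-ins for the types involved (hypothetical, for illustration only).
// Like Flink's Java-defined WindowAssigner<T, W>, Java generics are invariant in T.
class Assigner<T> {
    String assign(T element) {
        return "window(" + element + ")";
    }
}

class Pipeline {
    // Mirrors the constraint from the error message: windowAll needs an
    // assigner for some supertype of the element type E —
    // WindowAssigner[_ >: E, ?] in Scala, Assigner<? super E> in Java.
    static <E> String windowAll(Assigner<? super E> assigner, E element) {
        return assigner.assign(element);
    }
}
```

Here `Pipeline.windowAll(new Assigner<Object>(), 1)` compiles because the int literal boxes to Integer, a subtype of Object; in Scala the unboxed Int has no such relationship to Object, which is exactly what "required: WindowAssigner[_ >: Int, ?]" is rejecting.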