[ https://issues.apache.org/jira/browse/FLINK-21212?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Flink Jira Bot updated FLINK-21212:
-----------------------------------
Labels: auto-deprioritized-major auto-deprioritized-minor (was: auto-deprioritized-major stale-minor)
Priority: Not a Priority (was: Minor)
This issue was labeled "stale-minor" 7 days ago and has not received any
updates, so it is being deprioritized. If this ticket is actually Minor, please
raise the priority and ask a committer to assign you the issue or revive the
public discussion.
> Can no longer cast INT to DATE
> ------------------------------
>
> Key: FLINK-21212
> URL: https://issues.apache.org/jira/browse/FLINK-21212
> Project: Flink
> Issue Type: Improvement
> Components: API / Type Serialization System
> Affects Versions: 1.12.1
> Environment: EMR 6.1
> Flink 1.12.1
> Reporter: Rex Remind
> Priority: Not a Priority
> Labels: auto-deprioritized-major, auto-deprioritized-minor
>
> I upgraded from 1.11.3 to 1.12.1 and can no longer cast INT to DATE. The data
> arrives from Debezium. We also switched from JSON to Avro, but I'd think that
> is unrelated.
> Example:
> {code:java}
> .addOrReplaceColumns(
>   $"date".cast(Types.SQL_DATE()) as "date"
> )
> {code}
>
> Result:
> {code:java}
> org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: Invalid function call:
> cast(INT, DATE)
>     at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:360)
>     at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:213)
>     at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:114)
>     at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:816)
>     at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:248)
>     at org.apache.flink.client.cli.CliFrontend.parseAndRun(CliFrontend.java:1058)
>     at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:1136)
>     at java.base/java.security.AccessController.doPrivileged(Native Method)
>     at java.base/javax.security.auth.Subject.doAs(Subject.java:423)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
>     at org.apache.flink.runtime.security.contexts.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
>     at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1136)
> Caused by: org.apache.flink.table.api.ValidationException: Invalid function call:
> cast(INT, DATE)
>     at org.apache.flink.table.types.inference.TypeInferenceUtil.createInvalidCallException(TypeInferenceUtil.java:190)
>     at org.apache.flink.table.types.inference.TypeInferenceUtil.runTypeInference(TypeInferenceUtil.java:87)
>     at org.apache.flink.table.expressions.resolver.rules.ResolveCallByArgumentsRule$ResolvingCallVisitor.runTypeInference(ResolveCallByArgumentsRule.java:222)
>     at org.apache.flink.table.expressions.resolver.rules.ResolveCallByArgumentsRule$ResolvingCallVisitor.lambda$visit$1(ResolveCallByArgumentsRule.java:149)
>     at java.base/java.util.Optional.map(Optional.java:265)
>     at org.apache.flink.table.expressions.resolver.rules.ResolveCallByArgumentsRule$ResolvingCallVisitor.visit(ResolveCallByArgumentsRule.java:147)
>     at org.apache.flink.table.expressions.resolver.rules.ResolveCallByArgumentsRule$ResolvingCallVisitor.visit(ResolveCallByArgumentsRule.java:91)
>     at org.apache.flink.table.expressions.ApiExpressionVisitor.visit(ApiExpressionVisitor.java:37)
>     at org.apache.flink.table.expressions.UnresolvedCallExpression.accept(UnresolvedCallExpression.java:128)
>     at org.apache.flink.table.expressions.resolver.rules.ResolveCallByArgumentsRule$ResolvingCallVisitor.visit(ResolveCallByArgumentsRule.java:138)
>     at org.apache.flink.table.expressions.resolver.rules.ResolveCallByArgumentsRule$ResolvingCallVisitor.visit(ResolveCallByArgumentsRule.java:91)
>     at org.apache.flink.table.expressions.ApiExpressionVisitor.visit(ApiExpressionVisitor.java:37)
>     at org.apache.flink.table.expressions.UnresolvedCallExpression.accept(UnresolvedCallExpression.java:128)
>     at org.apache.flink.table.expressions.resolver.rules.ResolveCallByArgumentsRule.lambda$apply$0(ResolveCallByArgumentsRule.java:85)
>     at java.base/java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:271)
>     at java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1655)
>     at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484)
>     at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474)
>     at java.base/java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:913)
>     at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
>     at java.base/java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:578)
>     at org.apache.flink.table.expressions.resolver.rules.ResolveCallByArgumentsRule.apply(ResolveCallByArgumentsRule.java:86)
>     at org.apache.flink.table.expressions.resolver.ExpressionResolver.lambda$null$1(ExpressionResolver.java:212)
>     at java.base/java.util.function.Function.lambda$andThen$1(Function.java:88)
>     at org.apache.flink.table.expressions.resolver.ExpressionResolver.resolve(ExpressionResolver.java:175)
>     at org.apache.flink.table.operations.utils.OperationTreeBuilder.projectInternal(OperationTreeBuilder.java:191)
>     at org.apache.flink.table.operations.utils.OperationTreeBuilder.project(OperationTreeBuilder.java:163)
>     at org.apache.flink.table.operations.utils.OperationTreeBuilder.addColumns(OperationTreeBuilder.java:208)
>     at org.apache.flink.table.api.internal.TableImpl.addColumnsOperation(TableImpl.java:483)
>     at org.apache.flink.table.api.internal.TableImpl.addOrReplaceColumns(TableImpl.java:465)
>     at com.remind.graph.plans.people_compacted.PeopleCompactedJobScala.executePlan(PeopleCompactedJobScala.scala:283)
>     at com.remind.graph.plans.FlinkJobBase.run(FlinkJobBase.java:96)
>     at com.remind.graph.plans.people_compacted.PeopleCompactedJobScala$.main(PeopleCompactedJobScala.scala:87)
>     at com.remind.graph.plans.people_compacted.PeopleCompactedJobScala.main(PeopleCompactedJobScala.scala)
>     at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.base/java.lang.reflect.Method.invoke(Method.java:566)
>     at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:343)
>     ... 11 more
> Caused by: org.apache.flink.table.api.ValidationException: Invalid input arguments. Expected signatures are:
> cast(<ANY>, <TYPE LITERAL>)
>     at org.apache.flink.table.types.inference.TypeInferenceUtil.createInvalidInputException(TypeInferenceUtil.java:177)
>     at org.apache.flink.table.types.inference.TypeInferenceUtil.runTypeInferenceInternal(TypeInferenceUtil.java:333)
>     at org.apache.flink.table.types.inference.TypeInferenceUtil.runTypeInference(TypeInferenceUtil.java:85)
>     ... 48 more
> Caused by: org.apache.flink.table.api.ValidationException: Unsupported cast from 'INT' to 'DATE'.
>     at org.apache.flink.table.types.inference.CallContext.newValidationError(CallContext.java:93)
>     at org.apache.flink.table.types.inference.strategies.CastInputTypeStrategy.inferInputTypes(CastInputTypeStrategy.java:72)
>     at org.apache.flink.table.types.inference.TypeInferenceUtil.inferInputTypes(TypeInferenceUtil.java:436)
>     at org.apache.flink.table.types.inference.TypeInferenceUtil.adaptArguments(TypeInferenceUtil.java:124)
>     at org.apache.flink.table.types.inference.TypeInferenceUtil.adaptArguments(TypeInferenceUtil.java:101)
>     at org.apache.flink.table.types.inference.TypeInferenceUtil.runTypeInferenceInternal(TypeInferenceUtil.java:331)
>     ... 49 more
> {code}
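>
> A possible workaround, as a minimal sketch rather than a fix for the regression: Debezium typically encodes DATE columns as an INT of days since the epoch, so the value can be rebuilt into a DATE with a small scalar function instead of a cast. The function name EpochDaysToDate is only illustrative, and the commented call(...) usage assumes the Flink 1.12 Scala expression DSL is in scope (as in the snippet above).
> {code:java}
> import java.time.LocalDate
>
> import org.apache.flink.table.api._
> import org.apache.flink.table.functions.ScalarFunction
>
> // Illustrative UDF: interprets the INT as days since 1970-01-01 (Debezium's
> // date encoding) and returns a LocalDate, which Flink's new type system maps to DATE.
> class EpochDaysToDate extends ScalarFunction {
>   def eval(days: Integer): LocalDate =
>     if (days == null) null else LocalDate.ofEpochDay(days.longValue())
> }
>
> // Possible usage in place of the failing cast:
> // .addOrReplaceColumns(
> //   call(classOf[EpochDaysToDate], $"date") as "date"
> // )
> {code}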
--
This message was sent by Atlassian Jira
(v8.3.4#803005)