Right, this is a production job on the 1.9 blink planner; I will test it with the 1.10 blink planner today.

Jark Wu <imj...@gmail.com> wrote on Wed, Mar 18, 2020 at 11:47 PM:

> Hi zhisheng,
>
> Are you on 1.9? Have you tried the 1.10.0 blink planner?
>
> On Wed, 18 Mar 2020 at 22:21, zhisheng <zhisheng2...@gmail.com> wrote:
>
> > hi, all
> >
> > I defined a column (yidun_score) in one of my tables as numeric(5,2), and
> > the DDL of the PostgreSQL sink table also declares yidun_score as
> > numeric(5,2), yet the job throws the following exception:
> >
> > org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: Field types of query result and registered TableSink [Result] do not match.
> > Query result schema: [user_new_id: Long, total_credit_score: Integer, total_order_count: Integer, loss_total_order_count: Integer, yidun_score: BigDecimal, is_delete: Boolean]
> > TableSink schema:    [user_new_id: Long, total_credit_score: Integer, total_order_count: Integer, loss_total_order_count: Integer, yidun_score: BigDecimal, is_delete: Boolean]
> >     at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:593)
> >     at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:438)
> >     at org.apache.flink.client.program.OptimizerPlanEnvironment.getOptimizedPlan(OptimizerPlanEnvironment.java:83)
> >     at org.apache.flink.client.program.PackagedProgramUtils.createJobGraph(PackagedProgramUtils.java:80)
> >     at org.apache.flink.client.program.PackagedProgramUtils.createJobGraph(PackagedProgramUtils.java:122)
> >     at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:227)
> >     at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:205)
> >     at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1010)
> >     at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:1083)
> >     at java.security.AccessController.doPrivileged(Native Method)
> >     at javax.security.auth.Subject.doAs(Subject.java:422)
> >     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
> >     at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
> >     at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1083)
> > Caused by: org.apache.flink.table.api.ValidationException: Field types of query result and registered TableSink [Result] do not match.
> > Query result schema: [user_new_id: Long, total_credit_score: Integer, total_order_count: Integer, loss_total_order_count: Integer, yidun_score: BigDecimal, is_delete: Boolean]
> > TableSink schema:    [user_new_id: Long, total_credit_score: Integer, total_order_count: Integer, loss_total_order_count: Integer, yidun_score: BigDecimal, is_delete: Boolean]
> >     at org.apache.flink.table.planner.sinks.TableSinkUtils$.validateSink(TableSinkUtils.scala:69)
> >     at org.apache.flink.table.planner.delegation.PlannerBase$$anonfun$2.apply(PlannerBase.scala:179)
> >     at org.apache.flink.table.planner.delegation.PlannerBase$$anonfun$2.apply(PlannerBase.scala:178)
> >     at scala.Option.map(Option.scala:146)
> >     at org.apache.flink.table.planner.delegation.PlannerBase.translateToRel(PlannerBase.scala:178)
> >     at org.apache.flink.table.planner.delegation.PlannerBase$$anonfun$1.apply(PlannerBase.scala:146)
> >     at org.apache.flink.table.planner.delegation.PlannerBase$$anonfun$1.apply(PlannerBase.scala:146)
> >     at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
> >     at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
> >     at scala.collection.Iterator$class.foreach(Iterator.scala:891)
> >
> > I then changed the numeric(5,2) columns to double, and the exception no
> > longer occurs; the data is written to PG normally. Is this case a bug?
> >
>

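A side note on why the two printed schemas look identical yet "do not match": the printout shows only the Java class (BigDecimal) for both sides, while the validator likely also compares the DECIMAL precision and scale, which are not shown and can still differ (e.g. a precision/scale derived by the planner vs. the declared numeric(5,2)). The concrete precision/scale values below are hypothetical; this is a minimal plain-Python sketch of that kind of comparison, not Flink's actual code:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class DecimalType:
    """A toy decimal type whose equality includes precision and scale."""
    precision: int
    scale: int

    def __str__(self):
        # Mimics a printout that hides precision/scale: every decimal
        # type renders as the same bare class name.
        return "BigDecimal"


# Hypothetical example: the planner derives one precision/scale for the
# query result, while the sink DDL declares numeric(5, 2).
query_type = DecimalType(precision=38, scale=18)
sink_type = DecimalType(precision=5, scale=2)

print(str(query_type), str(sink_type))  # both render as "BigDecimal"
print(query_type == sink_type)          # False: precision/scale differ
```

So an error message can legitimately show two byte-identical schema strings while the underlying types still fail validation.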