Hi,

Flink's built-in function should have been used here, but due to a bug [1] the HiveModule
version is currently picked up instead. A simple workaround is to implement your own Module
that extends HiveModule and overrides the getFunctionDefinition method to return empty for
REGEXP_REPLACE.
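
A minimal sketch of that workaround, assuming the Flink 1.10 Module API (where
`HiveModule#getFunctionDefinition(String)` returns an `Optional[FunctionDefinition]`);
the class name `MyHiveModule` is made up for illustration:

```scala
import java.util.Optional

import org.apache.flink.table.functions.FunctionDefinition
import org.apache.flink.table.module.hive.HiveModule

// Report REGEXP_REPLACE as undefined in the Hive module so that function
// resolution can fall back to Flink's built-in implementation instead.
class MyHiveModule(hiveVersion: String) extends HiveModule(hiveVersion) {
  override def getFunctionDefinition(name: String): Optional[FunctionDefinition] =
    if ("regexp_replace".equalsIgnoreCase(name)) Optional.empty()
    else super.getFunctionDefinition(name)
}
```

Then load it in place of the stock module, e.g.
`tableEnv.loadModule("myhive", new MyHiveModule("1.2.1"))`.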

[1] https://issues.apache.org/jira/browse/FLINK-15595

Best,
Jingsong Lee

On Fri, May 8, 2020 at 5:19 PM like <likeg...@163.com> wrote:

>
> Initially I registered a Hive catalog because I use a custom UDF defined in Hive; later I also needed Hive's built-in functions, so I loaded the HiveModule as well. The code is as follows:
>
>
> val hive = new HiveCatalog("hive", "default", "/etc/hive_config", "1.2.1")
> tableEnv.registerCatalog("hive", hive)
> tableEnv.useCatalog("hive")
> tableEnv.useDatabase("default")
> tableEnv.loadModule("myhive", new HiveModule("1.2.1"))
>
>
>
>
> On May 8, 2020 at 16:30, Jingsong Li <jingsongl...@gmail.com> wrote:
> Hi,
>
> How are you using the HiveModule? Have you also kept the CoreModule?
>
> Best,
> Jingsong Lee
>
> On Fri, May 8, 2020 at 4:14 PM like <likeg...@163.com> wrote:
>
> Hi all,
> I've run into some problems while using the HiveModule. Before loading the HiveModule, the
> REGEXP_REPLACE function worked fine; after loading it, the same call fails whenever I replace
> a character with the empty string: REGEXP_REPLACE('abcd', 'a', ''). Has anyone else hit this
> problem? Alternatively, is there a way to choose between Flink's and Hive's built-in functions?
>
>
>
>
> The exception stack trace is as follows:
>
>
> org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: SQL validation failed. java.lang.reflect.InvocationTargetException
>     at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:335)
>     at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:205)
>     at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:138)
>     at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:664)
>     at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:213)
>     at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:895)
>     at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:968)
>     at org.apache.flink.client.cli.CliFrontend$$Lambda$38/1264413185.call(Unknown Source)
>     at org.apache.flink.runtime.security.HadoopSecurityContext$$Lambda$39/1243806178.run(Unknown Source)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:422)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1692)
>     at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
>     at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:968)
> Caused by: org.apache.flink.table.api.ValidationException: SQL validation failed. java.lang.reflect.InvocationTargetException
>     at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$validate(FlinkPlannerImpl.scala:130)
>     at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.validate(FlinkPlannerImpl.scala:105)
>     at org.apache.flink.table.planner.operations.SqlToOperationConverter.convert(SqlToOperationConverter.java:127)
>     at org.apache.flink.table.planner.delegation.ParserImpl.parse(ParserImpl.java:66)
>     at org.apache.flink.table.api.internal.TableEnvironmentImpl.sqlQuery(TableEnvironmentImpl.java:464)
>     at com.sui.bigdata.PlatformEngine$.$anonfun$main$4(PlatformEngine.scala:88)
>     at com.sui.bigdata.PlatformEngine$.$anonfun$main$4$adapted(PlatformEngine.scala:87)
>     at com.sui.bigdata.PlatformEngine$$$Lambda$765/1756039478.apply(Unknown Source)
>     at scala.collection.immutable.List.foreach(List.scala:388)
>     at com.sui.bigdata.PlatformEngine$.main(PlatformEngine.scala:87)
>     at com.sui.bigdata.PlatformEngine.main(PlatformEngine.scala)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:497)
>     at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:321)
>     ... 13 more
> Caused by: java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
>     at org.apache.flink.table.planner.functions.utils.HiveFunctionUtils.invokeGetResultType(HiveFunctionUtils.java:77)
>     at org.apache.flink.table.planner.functions.utils.HiveScalarSqlFunction.lambda$createReturnTypeInference$0(HiveScalarSqlFunction.java:83)
>     at org.apache.flink.table.planner.functions.utils.HiveScalarSqlFunction$$Lambda$780/1747631271.inferReturnType(Unknown Source)
>     at org.apache.calcite.sql.SqlOperator.inferReturnType(SqlOperator.java:470)
>     at org.apache.calcite.sql.SqlOperator.validateOperands(SqlOperator.java:437)
>     at org.apache.calcite.sql.SqlFunction.deriveType(SqlFunction.java:303)
>     at org.apache.calcite.sql.SqlFunction.deriveType(SqlFunction.java:219)
>     at org.apache.calcite.sql.validate.SqlValidatorImpl$DeriveTypeVisitor.visit(SqlValidatorImpl.java:5600)
>     at org.apache.calcite.sql.validate.SqlValidatorImpl$DeriveTypeVisitor.visit(SqlValidatorImpl.java:5587)
>     at org.apache.calcite.sql.SqlCall.accept(SqlCall.java:139)
>     at org.apache.calcite.sql.validate.SqlValidatorImpl.deriveTypeImpl(SqlValidatorImpl.java:1691)
>     at org.apache.calcite.sql.validate.SqlValidatorImpl.deriveType(SqlValidatorImpl.java:1676)
>     at org.apache.calcite.sql.type.InferTypes.lambda$static$0(InferTypes.java:46)
>     at org.apache.calcite.sql.type.InferTypes$$Lambda$169/1269773610.inferOperandTypes(Unknown Source)
>     at org.apache.calcite.sql.validate.SqlValidatorImpl.inferUnknownTypes(SqlValidatorImpl.java:1865)
>     at org.apache.calcite.sql.validate.SqlValidatorImpl.inferUnknownTypes(SqlValidatorImpl.java:1873)
>     at org.apache.calcite.sql.validate.SqlValidatorImpl.validateWhereOrOn(SqlValidatorImpl.java:4040)
>     at org.apache.calcite.sql.validate.SqlValidatorImpl.validateJoin(SqlValidatorImpl.java:3181)
>     at org.apache.flink.table.planner.calcite.FlinkCalciteSqlValidator.validateJoin(FlinkCalciteSqlValidator.scala:84)
>     at org.apache.calcite.sql.validate.SqlValidatorImpl.validateFrom(SqlValidatorImpl.java:3113)
>     at org.apache.calcite.sql.validate.SqlValidatorImpl.validateSelect(SqlValidatorImpl.java:3376)
>     at org.apache.calcite.sql.validate.SelectNamespace.validateImpl(SelectNamespace.java:60)
>     at org.apache.calcite.sql.validate.AbstractNamespace.validate(AbstractNamespace.java:84)
>     at org.apache.calcite.sql.validate.SqlValidatorImpl.validateNamespace(SqlValidatorImpl.java:1008)
>     at org.apache.calcite.sql.validate.SqlValidatorImpl.validateQuery(SqlValidatorImpl.java:968)
>     at org.apache.calcite.sql.validate.SetopNamespace.validateImpl(SetopNamespace.java:102)
>     at org.apache.calcite.sql.validate.AbstractNamespace.validate(AbstractNamespace.java:84)
>     at org.apache.calcite.sql.validate.SqlValidatorImpl.validateNamespace(SqlValidatorImpl.java:1008)
>     at org.apache.calcite.sql.validate.SqlValidatorImpl.validateQuery(SqlValidatorImpl.java:968)
>     at org.apache.calcite.sql.SqlSetOperator.validateCall(SqlSetOperator.java:90)
>     at org.apache.calcite.sql.validate.SqlValidatorImpl.validateCall(SqlValidatorImpl.java:5304)
>     at org.apache.calcite.sql.SqlCall.validate(SqlCall.java:116)
>     at org.apache.calcite.sql.validate.SqlValidatorImpl.validateScopedExpression(SqlValidatorImpl.java:943)
>     at org.apache.calcite.sql.validate.SqlValidatorImpl.validate(SqlValidatorImpl.java:650)
>     at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$validate(FlinkPlannerImpl.scala:126)
>     ... 28 more
> Caused by: java.lang.reflect.InvocationTargetException
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:497)
>     at org.apache.flink.table.planner.functions.utils.HiveFunctionUtils.invokeGetResultType(HiveFunctionUtils.java:73)
>     ... 62 more
> Caused by: java.lang.RuntimeException: Char length 0 out of allowed range [1, 255]
>     at org.apache.hadoop.hive.serde2.typeinfo.BaseCharUtils.validateCharParameter(BaseCharUtils.java:39)
>     at org.apache.hadoop.hive.serde2.typeinfo.CharTypeInfo.<init>(CharTypeInfo.java:33)
>     at org.apache.hadoop.hive.serde2.typeinfo.TypeInfoFactory.createPrimitiveTypeInfo(TypeInfoFactory.java:146)
>     at org.apache.hadoop.hive.serde2.typeinfo.TypeInfoFactory.getPrimitiveTypeInfo(TypeInfoFactory.java:109)
>     at org.apache.hadoop.hive.serde2.typeinfo.TypeInfoFactory.getCharTypeInfo(TypeInfoFactory.java:165)
>     at org.apache.flink.table.catalog.hive.util.HiveTypeUtil$TypeInfoLogicalTypeVisitor.visit(HiveTypeUtil.java:191)
>     at org.apache.flink.table.catalog.hive.util.HiveTypeUtil$TypeInfoLogicalTypeVisitor.visit(HiveTypeUtil.java:173)
>     at org.apache.flink.table.types.logical.CharType.accept(CharType.java:149)
>     at org.apache.flink.table.catalog.hive.util.HiveTypeUtil.toHiveTypeInfo(HiveTypeUtil.java:84)
>     at org.apache.flink.table.functions.hive.HiveSimpleUDF.getHiveResultType(HiveSimpleUDF.java:127)
>     ... 67 more
>
>
>
> --
> Best, Jingsong Lee
>


-- 
Best, Jingsong Lee
