[ https://issues.apache.org/jira/browse/FLINK-26355?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17497891#comment-17497891 ]

luoyuxia commented on FLINK-26355:
----------------------------------

[~zoucao] Thanks for reporting it. It seems you're using the Hive dialect? Yes, you 
can override `getHiveResultType` to fix it.
As a workaround, you can change your SQL to:
{code:sql}
 json_tuple(tb.json, repeat('f1', 1), repeat('f2', 1)) 
{code}
Hope it helps.

In the long run, though, the proper fix is to use the new type inference, as 
described in [FLINK-26364|https://issues.apache.org/jira/browse/FLINK-26364].
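To give a rough idea of what that means from a function author's point of view, here 
is a minimal sketch of a function declaring its argument and result types through the 
`TypeInference` API; the class `EchoFunction` and the chosen types are my own example, 
not code from FLINK-26364:
{code:java}
import org.apache.flink.table.api.DataTypes;
import org.apache.flink.table.catalog.DataTypeFactory;
import org.apache.flink.table.functions.ScalarFunction;
import org.apache.flink.table.types.inference.TypeInference;
import org.apache.flink.table.types.inference.TypeStrategies;

// Hypothetical example function: echoes its string argument. With the new type
// inference, the function itself declares its input and output types, so the
// planner no longer needs legacy hooks such as getHiveResultType.
public class EchoFunction extends ScalarFunction {

    public String eval(String s) {
        return s;
    }

    @Override
    public TypeInference getTypeInference(DataTypeFactory typeFactory) {
        return TypeInference.newBuilder()
                // Accept a single STRING argument (CHAR/VARCHAR inputs are
                // implicitly cast) and declare STRING as the result type.
                .typedArguments(DataTypes.STRING())
                .outputTypeStrategy(TypeStrategies.explicit(DataTypes.STRING()))
                .build();
    }
}
{code}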



> VarCharType is not considered in HiveTableSqlFunction
> -----------------------------------------------------
>
>                 Key: FLINK-26355
>                 URL: https://issues.apache.org/jira/browse/FLINK-26355
>             Project: Flink
>          Issue Type: Improvement
>          Components: Table SQL / Ecosystem
>            Reporter: zoucao
>            Priority: Major
>
> VarCharType is not considered in `HiveTableSqlFunction#coerce`, see 
> [link|https://github.com/apache/flink/blob/a7192af8707f3f0d0f30fc71f3477edd92135cac/flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/functions/utils/HiveTableSqlFunction.java#L146].
> Before invoking `HiveTableSqlFunction#coerce`, Flink calls 
> `createFieldTypeFromLogicalType` to build the arguments array; if a field's type 
> is VARCHAR, an exception is thrown.
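To make the reported failure mode concrete, the following is a minimal, hypothetical 
sketch (class and method names are mine, not Flink's): a type-mapping helper that 
handles `CharType` but has no `VarCharType` branch, so any VARCHAR field falls through 
to an exception, mirroring what the description above reports for 
`HiveTableSqlFunction#coerce`:
{code:java}
import org.apache.flink.table.types.logical.CharType;
import org.apache.flink.table.types.logical.LogicalType;
import org.apache.flink.table.types.logical.VarCharType;

// Illustrative sketch only, not the actual Flink code.
public class CoerceSketch {

    static Class<?> argumentClassFor(LogicalType type) {
        if (type instanceof CharType) {
            return String.class;
        }
        // The report suggests a branch like this is what is missing:
        // if (type instanceof VarCharType) {
        //     return String.class;
        // }
        throw new UnsupportedOperationException("Unsupported type: " + type);
    }

    public static void main(String[] args) {
        System.out.println(argumentClassFor(new CharType(2)));    // works: CHAR(2)
        System.out.println(argumentClassFor(new VarCharType(2))); // throws: VARCHAR(2)
    }
}
{code}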



--
This message was sent by Atlassian Jira
(v8.20.1#820001)
