[
https://issues.apache.org/jira/browse/FLINK-33233?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17804607#comment-17804607
]
Matthias Pohl commented on FLINK-33233:
---------------------------------------
Out of curiosity: Is this really an improvement? It sounds more like a bug to
me. If that's the case, it might be worth backporting it to 1.18 and 1.17 as
well (since the affected version is stated as 1.17.0 in this Jira issue).
[~luoyuxia] [[email protected]] WDYT?
> Null pointer exception when non-native udf used in join condition
> ------------------------------------------------------------------
>
> Key: FLINK-33233
> URL: https://issues.apache.org/jira/browse/FLINK-33233
> Project: Flink
> Issue Type: Improvement
> Components: Connectors / Hive
> Affects Versions: 1.17.0
> Reporter: yunfan
> Assignee: yunfan
> Priority: Major
> Labels: pull-request-available
> Fix For: 1.19.0
>
>
> Any non-native udf used in a hive-parser join condition will cause a
> NullPointerException.
> It can be reproduced by adding the following test to
> {code:java}
> org.apache.flink.connectors.hive.HiveDialectQueryITCase{code}
>
> {code:java}
> // Add the following test to org.apache.flink.connectors.hive.HiveDialectQueryITCase
> @Test
> public void testUdfInJoinCondition() throws Exception {
>     // The non-native udf "hiveudf" is applied to foo.x inside the join condition,
>     // which triggers the NullPointerException before the fix.
>     List<Row> result =
>             CollectionUtil.iteratorToList(
>                     tableEnv.executeSql(
>                                     "select foo.y, bar.I from bar join foo"
>                                             + " on hiveudf(foo.x) = bar.I where bar.I > 1")
>                             .collect());
>     assertThat(result.toString()).isEqualTo("[+I[2, 2]]");
> } {code}
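>
> For context, below is a minimal sketch of what such a non-native udf could look
> like: any function backed by Hive's GenericUDF rather than a built-in. The class
> name, its behaviour, and the registration statement are illustrative assumptions,
> not part of the original reproduction; the test only presumes that some
> GenericUDF-backed function named "hiveudf" is registered before it runs.
> {code:java}
> import org.apache.hadoop.hive.ql.exec.UDFArgumentException;
> import org.apache.hadoop.hive.ql.metadata.HiveException;
> import org.apache.hadoop.hive.ql.udf.generic.GenericUDF;
> import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
> import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;
>
> // Hypothetical non-native udf: parses its single argument as an int.
> public class MyNonNativeUdf extends GenericUDF {
>
>     @Override
>     public ObjectInspector initialize(ObjectInspector[] arguments) throws UDFArgumentException {
>         return PrimitiveObjectInspectorFactory.javaIntObjectInspector;
>     }
>
>     @Override
>     public Object evaluate(DeferredObject[] arguments) throws HiveException {
>         Object value = arguments[0].get();
>         return value == null ? null : Integer.parseInt(value.toString());
>     }
>
>     @Override
>     public String getDisplayString(String[] children) {
>         return "my_non_native_udf";
>     }
> }
>
> // Registration under the Hive dialect, e.g. in the test setup
> // (the function name "hiveudf" matches the one used in the join condition above):
> // tableEnv.executeSql(
> //         "create temporary function hiveudf as '" + MyNonNativeUdf.class.getName() + "'");
> {code}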