[ https://issues.apache.org/jira/browse/FLINK-24862?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

ASF GitHub Bot updated FLINK-24862:
-----------------------------------
    Labels: pull-request-available  (was: )

> The user-defined hive udaf/udtf cannot be used normally in hive dialect
> -----------------------------------------------------------------------
>
>                 Key: FLINK-24862
>                 URL: https://issues.apache.org/jira/browse/FLINK-24862
>             Project: Flink
>          Issue Type: Bug
>          Components: Connectors / Hive
>    Affects Versions: 1.13.0, 1.14.0
>            Reporter: xiangqiao
>            Priority: Major
>              Labels: pull-request-available
>         Attachments: image-2021-11-10-20-55-11-988.png, 
> image-2021-11-10-21-04-32-660.png
>
>
> Here are two problems:
> 1. First problem: I added a unit test in HiveDialectITCase to reproduce it:
> {code:java}
> @Test
> public void testTemporaryFunctionUDAF() throws Exception {
>     // create temp function
>     tableEnv.executeSql(
>             String.format(
>                     "create temporary function temp_count as '%s'",
>                     GenericUDAFCount.class.getName()));
>     String[] functions = tableEnv.listUserDefinedFunctions();
>     assertArrayEquals(new String[] {"temp_count"}, functions);
>     // call the function
>     tableEnv.executeSql("create table src(x int)");
>     tableEnv.executeSql("insert into src values (1),(-1)").await();
>     assertEquals(
>             "[+I[2]]",
>             queryResult(tableEnv.sqlQuery("select temp_count(x) from src")).toString());
>     // switch DB and the temp function can still be used
>     tableEnv.executeSql("create database db1");
>     tableEnv.useDatabase("db1");
>     assertEquals(
>             "[+I[2]]",
>             queryResult(tableEnv.sqlQuery("select temp_count(x) from `default`.src"))
>                     .toString());
>     // drop the function
>     tableEnv.executeSql("drop temporary function temp_count");
>     functions = tableEnv.listUserDefinedFunctions();
>     assertEquals(0, functions.length);
>     tableEnv.executeSql("drop temporary function if exists foo");
> } {code}
> !image-2021-11-10-20-55-11-988.png!
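>
> For reference, a user-defined UDAF registered the same way would look roughly like the 
> sketch below. It is only an illustration (a simplified count; the class name MyCount and its 
> implementation are an example, not code from Flink or Hive), registered via the same 
> "create temporary function ... as '<class name>'" statement as in the test above:
> {code:java}
> import org.apache.hadoop.hive.ql.metadata.HiveException;
> import org.apache.hadoop.hive.ql.parse.SemanticException;
> import org.apache.hadoop.hive.ql.udf.generic.AbstractGenericUDAFResolver;
> import org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator;
> import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
> import org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector;
> import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;
> import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorUtils;
> import org.apache.hadoop.hive.serde2.typeinfo.TypeInfo;
> import org.apache.hadoop.io.LongWritable;
>
> /** Illustrative user-defined UDAF: counts non-null arguments, like a simplified GenericUDAFCount. */
> public class MyCount extends AbstractGenericUDAFResolver {
>
>     @Override
>     public GenericUDAFEvaluator getEvaluator(TypeInfo[] parameters) throws SemanticException {
>         return new MyCountEvaluator();
>     }
>
>     public static class MyCountEvaluator extends GenericUDAFEvaluator {
>
>         // object inspector for the partial count received in merge()
>         private transient PrimitiveObjectInspector partialOI;
>
>         @Override
>         public ObjectInspector init(Mode m, ObjectInspector[] parameters) throws HiveException {
>             super.init(m, parameters);
>             if (m == Mode.PARTIAL2 || m == Mode.FINAL) {
>                 partialOI = (PrimitiveObjectInspector) parameters[0];
>             }
>             // both the partial and the final result are a bigint
>             return PrimitiveObjectInspectorFactory.writableLongObjectInspector;
>         }
>
>         /** Aggregation buffer holding the running count. */
>         static class CountBuffer extends AbstractAggregationBuffer {
>             long count;
>         }
>
>         @Override
>         public AggregationBuffer getNewAggregationBuffer() throws HiveException {
>             return new CountBuffer();
>         }
>
>         @Override
>         public void reset(AggregationBuffer agg) throws HiveException {
>             ((CountBuffer) agg).count = 0L;
>         }
>
>         @Override
>         public void iterate(AggregationBuffer agg, Object[] parameters) throws HiveException {
>             if (parameters.length > 0 && parameters[0] != null) {
>                 ((CountBuffer) agg).count++;
>             }
>         }
>
>         @Override
>         public Object terminatePartial(AggregationBuffer agg) throws HiveException {
>             return terminate(agg);
>         }
>
>         @Override
>         public void merge(AggregationBuffer agg, Object partial) throws HiveException {
>             if (partial != null) {
>                 ((CountBuffer) agg).count += PrimitiveObjectInspectorUtils.getLong(partial, partialOI);
>             }
>         }
>
>         @Override
>         public Object terminate(AggregationBuffer agg) throws HiveException {
>             return new LongWritable(((CountBuffer) agg).count);
>         }
>     }
> }
> {code}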
> 2. Second problem: after fixing the first one, I hit another issue, and I added a second unit 
> test in HiveDialectITCase to reproduce it.
> This is a Hive UDTF compatibility issue with the initialize(StructObjectInspector) API. Refer to 
> this issue: https://issues.apache.org/jira/browse/HIVE-5737
> {code:java}
> @Test
> public void testTemporaryFunctionUDTFInitializeWithStructObjectInspector() throws Exception {
>     // create temp function
>     tableEnv.executeSql(
>             String.format(
>                     "create temporary function temp_split as '%s'",
>                     HiveGenericUDTFTest.TestSplitUDTFInitializeWithStructObjectInspector.class
>                             .getName()));
>     String[] functions = tableEnv.listUserDefinedFunctions();
>     assertArrayEquals(new String[] {"temp_split"}, functions);
>     // call the function
>     tableEnv.executeSql("create table src(x string)");
>     tableEnv.executeSql("insert into src values ('a,b,c')").await();
>     assertEquals(
>             "[+I[a], +I[b], +I[c]]",
>             queryResult(tableEnv.sqlQuery("select temp_split(x) from src")).toString());
>     // switch DB and the temp function can still be used
>     tableEnv.executeSql("create database db1");
>     tableEnv.useDatabase("db1");
>     assertEquals(
>             "[+I[a], +I[b], +I[c]]",
>             queryResult(tableEnv.sqlQuery("select temp_split(x) from `default`.src"))
>                     .toString());
>     // drop the function
>     tableEnv.executeSql("drop temporary function temp_split");
>     functions = tableEnv.listUserDefinedFunctions();
>     assertEquals(0, functions.length);
>     tableEnv.executeSql("drop temporary function if exists foo");
> } {code}
> !image-2021-11-10-21-04-32-660.png!
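>
> For reference, a UDTF that only overrides the non-deprecated initialize(StructObjectInspector) 
> introduced by HIVE-5737 would look roughly like the sketch below. It is only an illustration 
> (MySplitUDTF is not the actual TestSplitUDTFInitializeWithStructObjectInspector test class), but 
> it is the style of UDTF the test above registers:
> {code:java}
> import java.util.Collections;
> import java.util.List;
>
> import org.apache.hadoop.hive.ql.exec.UDFArgumentException;
> import org.apache.hadoop.hive.ql.metadata.HiveException;
> import org.apache.hadoop.hive.ql.udf.generic.GenericUDTF;
> import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
> import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorFactory;
> import org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector;
> import org.apache.hadoop.hive.serde2.objectinspector.StructField;
> import org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector;
> import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;
> import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorUtils;
>
> /** Illustrative split UDTF that only overrides initialize(StructObjectInspector) (HIVE-5737). */
> public class MySplitUDTF extends GenericUDTF {
>
>     private transient PrimitiveObjectInspector inputOI;
>
>     // Only the StructObjectInspector variant is overridden; the deprecated
>     // initialize(ObjectInspector[]) is deliberately not implemented.
>     @Override
>     public StructObjectInspector initialize(StructObjectInspector argOIs) throws UDFArgumentException {
>         List<? extends StructField> fields = argOIs.getAllStructFieldRefs();
>         if (fields.size() != 1) {
>             throw new UDFArgumentException("split expects exactly one argument");
>         }
>         inputOI = (PrimitiveObjectInspector) fields.get(0).getFieldObjectInspector();
>         // one output column of type string
>         return ObjectInspectorFactory.getStandardStructObjectInspector(
>                 Collections.singletonList("col1"),
>                 Collections.singletonList(
>                         (ObjectInspector) PrimitiveObjectInspectorFactory.javaStringObjectInspector));
>     }
>
>     @Override
>     public void process(Object[] args) throws HiveException {
>         // emit one row per comma-separated token
>         String value = PrimitiveObjectInspectorUtils.getString(args[0], inputOI);
>         for (String token : value.split(",")) {
>             forward(new Object[] {token});
>         }
>     }
>
>     @Override
>     public void close() throws HiveException {}
> }
> {code}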



--
This message was sent by Atlassian Jira
(v8.20.1#820001)
