[
https://issues.apache.org/jira/browse/SPARK-28710?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16907681#comment-16907681
]
Dongjoon Hyun commented on SPARK-28710:
---------------------------------------
Thank you for reporting this, [~abhishek.akg].
Thank you for pinging me, [~sandeep.katta2007]. I'll review your PR.
> [UDF] create or replace permanent function does not clear the jar in class path
> --------------------------------------------------------------------------------
>
> Key: SPARK-28710
> URL: https://issues.apache.org/jira/browse/SPARK-28710
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 3.0.0
> Reporter: ABHISHEK KUMAR GUPTA
> Priority: Major
>
> 0: jdbc:hive2://10.18.19.208:23040/default> create function addDoubles AS
> 'com.huawei.bigdata.hive.example.udf.AddDoublesUDF' using jar
> 'hdfs://hacluster/user/AddDoublesUDF.jar';
> +---------+
> | Result |
> +---------+
> +---------+
> No rows selected (0.216 seconds)
> 0: jdbc:hive2://10.18.19.208:23040/default> create or replace function
> addDoubles AS 'com.huawei.bigdata.hive.example.udf.multiply' using jar
> 'hdfs://hacluster/user/Multiply.jar';
> +---------+
> | Result |
> +---------+
> +---------+
> No rows selected (0.292 seconds)
> 0: jdbc:hive2://10.18.19.208:23040/default> select addDoubles(3,3);
> INFO : Added
> [/tmp/8f3d7e87-469e-45e9-b5d1-7c714c5e0183_resources/AddDoublesUDF.jar] to
> class path
> INFO : Added resources: [hdfs://hacluster/user/AddDoublesUDF.jar]
> INFO : Added
> [/tmp/8f3d7e87-469e-45e9-b5d1-7c714c5e0183_resources/AddDoublesUDF.jar] to
> class path
> INFO : Added resources: [hdfs://hacluster/user/AddDoublesUDF.jar]
> Error: org.apache.spark.sql.AnalysisException: Can not load class
> 'com.huawei.bigdata.hive.example.udf.multiply' when registering the function
> 'default.addDoubles', please make sure it is on the classpath; line 1 pos 7
> (state=,code=0)
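
For anyone hitting this before a fix lands: the replaced function's metadata does point at the new jar, so the stale classpath appears to be session-local state. A hedged workaround sketch, reusing the jar and class names from the report above (not verified against a fix, just the usual way to get a clean session classpath):

```sql
-- In the affected session, make sure the metastore holds the intended
-- definition (as in the repro above):
DROP FUNCTION IF EXISTS default.addDoubles;
CREATE FUNCTION default.addDoubles AS
  'com.huawei.bigdata.hive.example.udf.multiply'
  USING JAR 'hdfs://hacluster/user/Multiply.jar';

-- Then reconnect with a NEW beeline session, whose classpath does not
-- contain the old AddDoublesUDF.jar, and invoke the function there:
SELECT addDoubles(3, 3);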
--
This message was sent by Atlassian JIRA
(v7.6.14#76016)