Mustafa İman created HIVE-25040:
-----------------------------------

             Summary: Drop database cascade cannot remove persistent functions
                 Key: HIVE-25040
                 URL: https://issues.apache.org/jira/browse/HIVE-25040
             Project: Hive
          Issue Type: Bug
            Reporter: Mustafa İman
            Assignee: Mustafa İman


Add a persistent custom function to a database using a jar file: CREATE 
FUNCTION myfunction AS 'DummyUDF' USING JAR 'x.jar';
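
For reference, the jar only needs to contain a trivial UDF class. A minimal sketch (the class name mirrors the DummyUDF in the stack trace below; the body is illustrative):

{code:java}
import org.apache.hadoop.hive.ql.exec.UDF;

// Minimal permanent UDF packaged into x.jar; any UDF class reproduces the issue.
public class DummyUDF extends UDF {
  public String evaluate(String input) {
    return input;
  }
}
{code}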

Restart the session and immediately issue DROP DATABASE mydb CASCADE. It throws 
a ClassNotFoundException:
{code:java}
java.lang.ClassNotFoundException: DummyUDF
        at java.net.URLClassLoader.findClass(URLClassLoader.java:382) ~[?:1.8.0_282]
        at java.lang.ClassLoader.loadClass(ClassLoader.java:418) ~[?:1.8.0_282]
        at java.lang.ClassLoader.loadClass(ClassLoader.java:351) ~[?:1.8.0_282]
        at java.lang.Class.forName0(Native Method) ~[?:1.8.0_282]
        at java.lang.Class.forName(Class.java:348) ~[?:1.8.0_282]
        at org.apache.hadoop.hive.ql.exec.Registry.getPermanentUdfClass(Registry.java:549) ~[hive-exec-3.1.3000.7.2.10.0-36.jar:3.1.3000.7.2.10.0-36]
        at org.apache.hadoop.hive.ql.exec.Registry.removePersistentFunctionUnderLock(Registry.java:586) ~[hive-exec-3.1.3000.7.2.10.0-36.jar:3.1.3000.7.2.10.0-36]
        at org.apache.hadoop.hive.ql.exec.Registry.unregisterFunction(Registry.java:577) ~[hive-exec-3.1.3000.7.2.10.0-36.jar:3.1.3000.7.2.10.0-36]
        at org.apache.hadoop.hive.ql.exec.Registry.unregisterFunctions(Registry.java:607) ~[hive-exec-3.1.3000.7.2.10.0-36.jar:3.1.3000.7.2.10.0-36]
        at org.apache.hadoop.hive.ql.exec.FunctionRegistry.unregisterPermanentFunctions(FunctionRegistry.java:1731) ~[hive-exec-3.1.3000.7.2.10.0-36.jar:3.1.3000.7.2.10.0-36]
        at org.apache.hadoop.hive.ql.ddl.database.drop.DropDatabaseOperation.execute(DropDatabaseOperation.java:62) ~[hive-exec-3.1.3000.7.2.10.0-36.jar:3.1.3000.7.2.10.0-36]
        at org.apache.hadoop.hive.ql.ddl.DDLTask.execute(DDLTask.java:80) ~[hive-exec-3.1.3000.7.2.10.0-36.jar:3.1.3000.7.2.10.0-36]
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:213) ~[hive-exec-3.1.3000.7.2.10.0-36.jar:3.1.3000.7.2.10.0-36]
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:105) ~[hive-exec-3.1.3000.7.2.10.0-36.jar:3.1.3000.7.2.10.0-36]
        at org.apache.hadoop.hive.ql.Executor.launchTask(Executor.java:357) ~[hive-exec-3.1.3000.7.2.10.0-36.jar:3.1.3000.7.2.10.0-36]
        at org.apache.hadoop.hive.ql.Executor.launchTasks(Executor.java:330) ~[hive-exec-3.1.3000.7.2.10.0-36.jar:3.1.3000.7.2.10.0-36]
        at org.apache.hadoop.hive.ql.Executor.runTasks(Executor.java:246) ~[hive-exec-3.1.3000.7.2.10.0-36.jar:3.1.3000.7.2.10.0-36]
        at org.apache.hadoop.hive.ql.Executor.execute(Executor.java:109) ~[hive-exec-3.1.3000.7.2.10.0-36.jar:3.1.3000.7.2.10.0-36]
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:748) ~[hive-exec-3.1.3000.7.2.10.0-36.jar:3.1.3000.7.2.10.0-36]
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:497) ~[hive-exec-3.1.3000.7.2.10.0-36.jar:3.1.3000.7.2.10.0-36]
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:491) ~[hive-exec-3.1.3000.7.2.10.0-36.jar:3.1.3000.7.2.10.0-36]
        at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:166) ~[hive-exec-3.1.3000.7.2.10.0-36.jar:3.1.3000.7.2.10.0-36]
        at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:225) ~[hive-service-3.1.3000.7.2.10.0-36.jar:3.1.3000.7.2.10.0-36]
        at org.apache.hive.service.cli.operation.SQLOperation.access$700(SQLOperation.java:87) ~[hive-service-3.1.3000.7.2.10.0-36.jar:3.1.3000.7.2.10.0-36]
        at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork$1.run(SQLOperation.java:322) ~[hive-service-3.1.3000.7.2.10.0-36.jar:3.1.3000.7.2.10.0-36]
        at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_282]
        at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_282]
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1898) ~[hadoop-common-3.1.1.7.2.10.0-36.jar:?]
        at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork.run(SQLOperation.java:340) ~[hive-service-3.1.3000.7.2.10.0-36.jar:3.1.3000.7.2.10.0-36]
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_282]
        at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_282]
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_282]
        at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_282]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_282]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_282]
        at java.lang.Thread.run(Thread.java:748) [?:1.8.0_282]

{code}
 

Since the new session has not used the custom UDF before trying to drop it, the 
session state does not have the class loaded, so the class lookup throws 
ClassNotFoundException. In this case we can ignore the exception: there is no 
need to load the class just to remove the function right afterwards.
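
A self-contained sketch of that idea (illustrative only, not the actual Registry code; the class name mirrors the DummyUDF from the trace):

{code:java}
public class RemoveWithoutClassLoadSketch {

  // During unregister, the UDF class is looked up by name. In a fresh session
  // the jar is not on the classpath yet, so the lookup fails; for a removal we
  // can treat that as non-fatal instead of propagating ClassNotFoundException.
  static Class<?> tryLoadUdfClass(String className) {
    try {
      return Class.forName(className);
    } catch (ClassNotFoundException e) {
      // Nothing was loaded or cached for this class in the current session,
      // so there is no per-class cleanup to do; removal can simply proceed.
      return null;
    }
  }

  public static void main(String[] args) {
    Class<?> udfClass = tryLoadUdfClass("DummyUDF"); // not on the classpath here
    System.out.println("loaded=" + udfClass + " -> proceed with unregister");
  }
}
{code}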

The same problem does not occur with DROP FUNCTION, because the compiler loads 
the jar explicitly before trying to drop the function.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
