Github user yhuai commented on the pull request:

    https://github.com/apache/spark/pull/6758#issuecomment-111038774
  
    Actually, the problem is the class loader associated with the `HiveConf` inside `ClientWrapper.state`. For the execution Hive, this class loader is the one that loaded `HiveContext`. Then, whenever we call `withHiveState`, we call `setCurrentSessionState` with this `state`. Inside `setCurrentSessionState`, we do the following:
    ```
    public static void setCurrentSessionState(SessionState startSs) {
      tss.set(startSs);
      Thread.currentThread().setContextClassLoader(startSs.getConf().getClassLoader());
    }
    ```
    In other words, the current thread's context class loader is set to the class loader of the conf associated with the state.
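    
    As a rough Scala sketch (an assumed shape, not the actual `ClientWrapper` code), every `withHiveState` call effectively does something like the following, which is why the thread's context class loader keeps getting reset from `state.getConf.getClassLoader`:
    ```
    import org.apache.hadoop.hive.ql.session.SessionState
    
    def withHiveState[A](state: SessionState)(f: => A): A = {
      // setCurrentSessionState stores the state in a thread local and, as shown
      // above, resets the context class loader from state.getConf.getClassLoader.
      SessionState.setCurrentSessionState(state)
      f
    }
    ```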
    
    Inside the `AddJar` command, although we call `org.apache.hadoop.hive.ql.metadata.Hive.get().getConf().setClassLoader(newClassLoader)`, this may not set the class loader on the conf associated with `executionHive.state`, because `org.apache.hadoop.hive.ql.metadata.Hive.get()` returns a thread-local instance, so its conf may not be the same conf held by `executionHive.state`.
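    
    As a minimal sketch of that mismatch (`stateConf` stands in for the conf inside `executionHive.state` and `newClassLoader` for the loader built for the added jar; both names are placeholders): because `Hive.get()` is thread local, its conf and `stateConf` can be two different `HiveConf` objects, so the update made in `AddJar` never reaches the class loader that `setCurrentSessionState` installs.
    ```
    import org.apache.hadoop.hive.conf.HiveConf
    import org.apache.hadoop.hive.ql.metadata.Hive
    
    def addJarClassLoaderMismatch(stateConf: HiveConf, newClassLoader: ClassLoader): Unit = {
      // What the AddJar command effectively does: update the conf of the
      // thread-local Hive object.
      Hive.get().getConf.setClassLoader(newClassLoader)
    
      // What setCurrentSessionState later does: reset the thread's context
      // class loader from the conf held by the session state.
      Thread.currentThread().setContextClassLoader(stateConf.getClassLoader)
    
      // If the two confs are different HiveConf instances, newClassLoader never
      // becomes the context class loader here.
      println(s"same conf: ${Hive.get().getConf eq stateConf}")
    }
    ```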
    
    Also cc @JoshRosen 

