Github user rdblue commented on the pull request:

    https://github.com/apache/spark/pull/13236#issuecomment-221074125
  
    Why is Hive's ClassLoader loading Hadoop classes itself rather than 
delegating to the ClassLoader that is responsible for Hadoop? Hive should be 
using shims to interact with whatever Hadoop classes are available. Is there a 
reason not to share Hadoop classes?
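
    To sketch roughly what I mean by delegating (the class and loader names
below are made up for illustration, not anything in this PR): anything under
org.apache.hadoop gets resolved by the loader that already owns Hadoop, so the
isolated side sees the caller's Hadoop classes instead of loading its own
copies.

        import java.net.{URL, URLClassLoader}

        // Hypothetical example, not the loader from this PR: delegate every
        // Hadoop class to the loader that already owns Hadoop so both sides
        // share the same Configuration, FileSystem, etc. classes.
        class HadoopSharingLoader(urls: Array[URL], hadoopLoader: ClassLoader)
          extends URLClassLoader(urls, null) {

          override def loadClass(name: String, resolve: Boolean): Class[_] = {
            if (name.startsWith("org.apache.hadoop.")) {
              // Share Hadoop classes with the caller instead of reloading them.
              hadoopLoader.loadClass(name)
            } else {
              super.loadClass(name, resolve)
            }
          }
        }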
    
    I think rebuilding a Configuration and passing it in is fine, but then you
have the problem of making an exact replica of a Configuration object. I'm not
sure everything that needs to be copied (at a minimum, the final properties).
And that still doesn't preserve the behavior of addDefaultResource. Like I
said, I think the ideal solution is to use the incoming Configuration.
Otherwise, you can probably get away with copying the properties and preserving
which ones are final.
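
    For the copy-the-properties route, something like the sketch below is what
I have in mind (ConfSnapshot is a hypothetical helper, not code in this PR).
The Configuration copy constructor clones both the properties and the set of
properties marked final, but it is still only a snapshot, so anything a later
addDefaultResource call adds on the original side won't show up in the copy.

        import org.apache.hadoop.conf.Configuration

        // Hypothetical helper, not code from this PR.
        object ConfSnapshot {
          // Take a point-in-time copy of a Configuration to hand to the
          // isolated Hive client.
          def snapshot(original: Configuration): Configuration = {
            // The copy constructor clones the properties and the set of
            // properties marked final, so finality is preserved in the copy.
            val copy = new Configuration(original)

            // This is still only a snapshot: resources registered later via
            // Configuration.addDefaultResource on the original side are not
            // picked up here, which is the limitation described above.
            copy
          }
        }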


