pan3793 commented on pull request #33989:
URL: https://github.com/apache/spark/pull/33989#issuecomment-919689151


   > Spark today already leaks these dependencies
   
   Yeah, but users can see which jars are in `$SPARK_HOME/jars`, so they can 
1) align dependency versions with Spark when building a Spark application, and 
2) exclude jars already under `$SPARK_HOME/jars` when packaging or submitting a 
Spark job.
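   As a sketch of point 2, assuming a Maven build, a dependency that Spark 
already ships can be marked `provided` so it is not bundled into the 
application jar (the artifact and version placeholder here are illustrative):

   ```xml
   <!-- Illustrative: Spark ships this jar in $SPARK_HOME/jars,
        so keep it out of the application's fat jar -->
   <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-sql_2.12</artifactId>
     <version>${spark.version}</version>
     <scope>provided</scope>
   </dependency>
   ```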
   
   For `hive-shaded`, the user has no idea which jars should be aligned or 
excluded.
   
   I think `hive-shaded` should relocate all classes except `org.apache.hive`, 
which keeps things simple: the user just needs to exclude all vanilla Hive jars.
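   A minimal `maven-shade-plugin` sketch of that relocation policy (the 
relocated package and shaded prefix are illustrative assumptions, not the 
actual `hive-shaded` build configuration):

   ```xml
   <plugin>
     <groupId>org.apache.maven.plugins</groupId>
     <artifactId>maven-shade-plugin</artifactId>
     <configuration>
       <relocations>
         <!-- Illustrative: relocate a bundled third-party package
              under a shaded prefix so it cannot clash with the
              user's classpath -->
         <relocation>
           <pattern>com.google.common</pattern>
           <shadedPattern>org.apache.hive.shaded.com.google.common</shadedPattern>
         </relocation>
         <!-- org.apache.hive itself is deliberately NOT relocated,
              so the public Hive API keeps its original package names -->
       </relocations>
     </configuration>
   </plugin>
   ```

   With every transitive dependency relocated like this, the only classes the 
shaded jar exposes under their original names are Hive's own, so excluding the 
vanilla Hive jars is sufficient to avoid conflicts.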


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
