pan3793 commented on PR #46520: URL: https://github.com/apache/spark/pull/46520#issuecomment-2105463104
@dongjoon-hyun Okay, since this fails the CI, we should revert the dependency removal first and do more investigation later.

As for supporting "legacy Hive UDF jars": if a user's UDF imports classes that come from Hive's transitive dependencies, I don't think Spark is responsible for providing them. Spark only ships a subset of Hive's dependencies (by which I mean all the jars shipped in the Hive binary tgz). For example, Hive ships `groovy-all-2.4.4.jar` but Spark does not; if a user's UDF imports classes from `groovy-all-2.4.4.jar`, it will fail on Spark with a `ClassNotFoundException`, and in that case the user should add the third-party dependency themselves. I believe jodd, commons-lang 2.x, and jackson 1.x are in the same position.
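To illustrate how a user would supply such a missing dependency themselves, here is a minimal Spark SQL sketch; the jar paths, function name, and UDF class name below are hypothetical, not from this PR:

```sql
-- Supply the third-party jar the legacy Hive UDF needs (illustrative path):
ADD JAR /path/to/groovy-all-2.4.4.jar;

-- Register the UDF from the user's own jar (illustrative class name):
ADD JAR /path/to/my-legacy-udf.jar;
CREATE TEMPORARY FUNCTION my_udf AS 'com.example.MyLegacyHiveUDF';

SELECT my_udf(col) FROM t;
```

Equivalently, the jars could be passed at launch time via `spark-submit --jars`, so the classes are on the classpath before the UDF is resolved.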
