wangyum commented on a change in pull request #25542:
[SPARK-28840][SQL][test-hadoop3.2] conf.getClassLoader in SparkSQLCLIDriver should be avoided as it returns the UDFClassLoader which is created by Hive
URL: https://github.com/apache/spark/pull/25542#discussion_r316600772
##########
File path: sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLCLIDriver.scala
##########
@@ -140,7 +141,7 @@ private[hive] object SparkSQLCLIDriver extends Logging {
// Hadoop-20 and above - we need to augment classpath using hiveconf
// components.
// See also: code in ExecDriver.java
- var loader = conf.getClassLoader
+ var loader = originalClassLoader
val auxJars = HiveConf.getVar(conf, HiveConf.ConfVars.HIVEAUXJARS)
if (StringUtils.isNotBlank(auxJars)) {
Review comment:
Please add another test case to cover `Utilities.addToClassPath(loader, StringUtils.split(auxJars, ","))`.
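
A minimal sketch of what such a test could look like, assuming it sits alongside the existing CLI tests in `CliSuite` and reuses its `runCliWithin` helper; the jar path, UDF class name, and query output below are placeholders rather than real test resources:

```scala
// Sketch only: assumes CliSuite already has scala.concurrent.duration._ and
// org.apache.hadoop.hive.conf.HiveConf.ConfVars in scope.
test("SPARK-28840: jars in hive.aux.jars.path are added to the CLI class path") {
  // Placeholder jar that is assumed to ship the UDF class used below.
  val auxJar = "/path/to/example-udf.jar"
  runCliWithin(
    1.minute,
    // spark.hadoop.* entries are propagated into the Hive conf, so this sets
    // hive.aux.jars.path and drives the Utilities.addToClassPath branch.
    Seq("--conf", s"spark.hadoop.${ConfVars.HIVEAUXJARS.varname}=$auxJar"))(
    // Registering and calling a UDF from the aux jar only succeeds if the jar
    // actually ended up on the class loader used by the CLI driver.
    "CREATE TEMPORARY FUNCTION example_udf AS 'com.example.ExampleUDF';" -> "",
    "SELECT example_udf('SPARK-28840');" -> "SPARK-28840"
  )
}
```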