wangyum commented on a change in pull request #25542:
[SPARK-28840][SQL][test-hadoop3.2] conf.getClassLoader in SparkSQLCLIDriver
should be avoided as it returns the UDFClassLoader which is created by Hive
URL: https://github.com/apache/spark/pull/25542#discussion_r316600471
##########
File path:
sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLCLIDriver.scala
##########
@@ -140,7 +141,7 @@ private[hive] object SparkSQLCLIDriver extends Logging {
// Hadoop-20 and above - we need to augment classpath using hiveconf
// components.
// See also: code in ExecDriver.java
- var loader = conf.getClassLoader
+ var loader = originalClassLoader
Review comment:
I think this is a correct fix. Another approach would be to add the Spark jars
via `Utilities.addToClassPath` so that the `UDFClassLoader` works:
```scala
val sparkJars = sparkConf.get(org.apache.spark.internal.config.JARS)
if (sparkJars.nonEmpty || StringUtils.isNotBlank(auxJars)) {
loader = Utilities.addToClassPath(loader, sparkJars.toArray ++
StringUtils.split(auxJars, ","))
}
```
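To make the classloader issue concrete, here is a minimal, self-contained Scala sketch (not the actual `SparkSQLCLIDriver` code) of why capturing the context classloader *before* library initialization matters. The `udfStyleLoader` below is a hypothetical stand-in for Hive's `UDFClassLoader`; the names `capture` and `ClassLoaderCaptureSketch` are invented for illustration:

```scala
// Sketch: capture the original context classloader before a library
// (simulated here) swaps in its own child loader, as Hive does with
// UDFClassLoader during SessionState setup.
object ClassLoaderCaptureSketch {
  // Returns (original loader, loader seen after the simulated swap).
  def capture(): (ClassLoader, ClassLoader) = {
    val originalClassLoader = Thread.currentThread().getContextClassLoader
    // Hypothetical stand-in for Hive's UDFClassLoader.
    val udfStyleLoader =
      new java.net.URLClassLoader(Array.empty, originalClassLoader)
    Thread.currentThread().setContextClassLoader(udfStyleLoader)
    try {
      // At this point conf.getClassLoader would return udfStyleLoader;
      // the fix keeps using originalClassLoader as the augmentation base.
      (originalClassLoader, Thread.currentThread().getContextClassLoader)
    } finally {
      // Restore, so the swap does not leak past this sketch.
      Thread.currentThread().setContextClassLoader(originalClassLoader)
    }
  }
}
```

After the swap, the current loader is a child of the original, so augmenting from `conf.getClassLoader` would build on Hive's loader rather than Spark's.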