xkrogen commented on a change in pull request #29874:
URL: https://github.com/apache/spark/pull/29874#discussion_r505891657
##########
File path:
sql/hive/src/main/scala/org/apache/spark/sql/hive/client/IsolatedClientLoader.scala
##########
@@ -61,7 +61,8 @@ private[hive] object IsolatedClientLoader extends Logging {
val files = if (resolvedVersions.contains((resolvedVersion,
hadoopVersion))) {
resolvedVersions((resolvedVersion, hadoopVersion))
} else {
- val remoteRepos = sparkConf.get(SQLConf.ADDITIONAL_REMOTE_REPOSITORIES)
+ val remoteRepos = sys.env.getOrElse(
+ "DEFAULT_ARTIFACT_REPOSITORY",
sparkConf.get(SQLConf.ADDITIONAL_REMOTE_REPOSITORIES))
Review comment:
I was looking at something similar today and realized that what I said
doesn't quite make sense, since you can't really externally configure
`spark.sql.maven.additionalRemoteRepositories` when running the tests via Maven.
I'm still not sure whether deriving a config's default value from an
environment variable makes sense, though. Maybe it would be better to update
the tests instead (if test configuration is indeed your concern).
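For context, the pattern in the diff checks an environment variable first and falls back to the SQL config value when it is unset. A minimal standalone sketch of that precedence (the `RepoResolution` object and the placeholder URL are hypothetical stand-ins; only `DEFAULT_ARTIFACT_REPOSITORY` and the fallback-to-`sparkConf` behavior come from the diff itself):

```scala
// Sketch of the env-var-overrides-config pattern from the diff above.
// `sys.env` is the process environment; `getOrElse` falls back to the
// configured value when DEFAULT_ARTIFACT_REPOSITORY is not set.
object RepoResolution {
  // Hypothetical stand-in for sparkConf.get(SQLConf.ADDITIONAL_REMOTE_REPOSITORIES)
  def configuredRepos: String = "https://repo.example.org/maven2/"

  def remoteRepos: String =
    sys.env.getOrElse("DEFAULT_ARTIFACT_REPOSITORY", configuredRepos)
}
```

The trade-off the comment raises is that the environment variable silently changes the config's effective default, rather than being set through the normal `SparkConf` mechanism.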