maropu commented on a change in pull request #29966:
URL: https://github.com/apache/spark/pull/29966#discussion_r530094153
##########
File path: core/src/main/scala/org/apache/spark/util/Utils.scala
##########
@@ -2980,6 +2980,75 @@ private[spark] object Utils extends Logging {
metadata.toString
}
+ /**
+ * Download an Ivy URI's dependent jars.
+ *
+ * @param uri Ivy URI to be downloaded.
+ * @return Comma-separated list of URIs of the downloaded jars.
+ */
+ def resolveMavenDependencies(uri: URI): String = {
+ val Seq(repositories, ivyRepoPath, ivySettingsPath) =
+ Seq(
+ "spark.jars.repositories",
+ "spark.jars.ivy",
+ "spark.jars.ivySettings"
+ ).map(sys.props.get(_).orNull)
+ // Create the IvySettings, either load from file or build defaults
+ val ivySettings = Option(ivySettingsPath) match {
+ case Some(path) =>
+ SparkSubmitUtils.loadIvySettings(path, Option(repositories), Option(ivyRepoPath))
Review comment:
> Maybe we can pull this out into a common utility that can be leveraged here and in DriverWrapper? Then there is no need for testing twice.
Yeah, if we can, that looks better.
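A minimal sketch of what such a shared helper could look like (names are illustrative, not this PR's final code; it only reuses the `SparkSubmitUtils.loadIvySettings` / `buildIvySettings` entry points that the snippet above already calls, and the `buildIvySettings` fallback for the no-settings-file case is an assumption):

```scala
// Illustrative sketch only: a shared helper that turns the spark.jars.* system
// properties into an IvySettings, so Utils.resolveMavenDependencies and
// DriverWrapper can share one (single-tested) code path.
// It would need to live under the org.apache.spark package to see SparkSubmitUtils.
import org.apache.ivy.core.settings.IvySettings
import org.apache.spark.deploy.SparkSubmitUtils

object IvyProperties {
  def loadIvySettingsFromSysProps(): IvySettings = {
    val repositories = sys.props.get("spark.jars.repositories")
    val ivyRepoPath = sys.props.get("spark.jars.ivy")
    sys.props.get("spark.jars.ivySettings") match {
      case Some(path) =>
        // Same call as in the diff above: honor a user-provided ivysettings.xml.
        SparkSubmitUtils.loadIvySettings(path, repositories, ivyRepoPath)
      case None =>
        // Assumed fallback: build default settings from the repo/cache options.
        SparkSubmitUtils.buildIvySettings(repositories, ivyRepoPath)
    }
  }
}
```

Both `Utils.resolveMavenDependencies` and `DriverWrapper` could then obtain their `IvySettings` from this one place, so the property handling only needs a single set of tests.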
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]