AngersZhuuuu commented on a change in pull request #29966:
URL: https://github.com/apache/spark/pull/29966#discussion_r531939805
##########
File path: core/src/main/scala/org/apache/spark/util/Utils.scala
##########
@@ -2980,6 +2980,77 @@ private[spark] object Utils extends Logging {
metadata.toString
}
+  /**
+   * Download the jars for an Ivy URI, including its dependent jars.
+   *
+   * @param uri Ivy URI of the jars to be downloaded.
+   * @return Comma-separated list of URIs of the downloaded jars
+   */
+  def resolveMavenDependencies(uri: URI): String = {
+    val Seq(repositories, ivyRepoPath, ivySettingsPath) =
+      Seq(
+        "spark.jars.repositories",
+        "spark.jars.ivy",
+        "spark.jars.ivySettings"
+      ).map(sys.props.get(_).orNull)
+    // Create the IvySettings, either load from file or build defaults
+    val ivySettings = Option(ivySettingsPath) match {
+      case Some(path) =>
+        SparkSubmitUtils.loadIvySettings(path, Option(repositories),
+          Option(ivyRepoPath))
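
For context, here is a minimal self-contained sketch of how the full method could read once the hunk above is completed. It assumes the Spark 3.0-era `SparkSubmitUtils` helpers (`loadIvySettings`, `buildIvySettings`, `resolveMavenCoordinates`) and that the Maven coordinate is carried in the URI authority (e.g. `ivy://org.apache.hive:hive-contrib:2.3.7`); it is a sketch, not necessarily the code this PR merges:

    import java.net.URI
    import org.apache.spark.deploy.SparkSubmitUtils

    // Sketch only: resolve an Ivy URI and download the jar plus its
    // transitive dependencies via SparkSubmitUtils.
    def resolveMavenDependencies(uri: URI): String = {
      val Seq(repositories, ivyRepoPath, ivySettingsPath) =
        Seq("spark.jars.repositories", "spark.jars.ivy", "spark.jars.ivySettings")
          .map(sys.props.get(_).orNull)
      // Load IvySettings from the configured file, or build defaults.
      val ivySettings = Option(ivySettingsPath) match {
        case Some(path) =>
          SparkSubmitUtils.loadIvySettings(path, Option(repositories), Option(ivyRepoPath))
        case None =>
          SparkSubmitUtils.buildIvySettings(Option(repositories), Option(ivyRepoPath))
      }
      // resolveMavenCoordinates downloads the artifact and its transitive
      // dependencies and returns a comma-separated list of local jar paths
      // (assumed coordinate format: the coordinate is uri.getAuthority).
      SparkSubmitUtils.resolveMavenCoordinates(uri.getAuthority, ivySettings)
    }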
Review comment:
> Some of this logic is duplicated from `DependencyUtils.resolveMavenDependencies`; it seems better if we can unify them? `DependencyUtils` seems like a more suitable place for this logic anyway.
>
> With the addition of `Utils.resolveMavenDependencies` we have three identically-named methods in 3 different utility classes... I worry this will become very confusing. Better to consolidate.
Yea, we can merge them, and I have updated the PR now. How about the current change?
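
To make the consolidation concrete, here is a sketch of what a single entry point on the existing `org.apache.spark.deploy.DependencyUtils` could look like. `getIvyProperties` is a hypothetical helper name, and the signatures follow the Spark 3.0-era `SparkSubmitUtils`; this is illustrative, not the merged code:

    import java.net.URI
    import org.apache.spark.deploy.SparkSubmitUtils

    // Sketch: additions to the existing DependencyUtils object so that both
    // SparkSubmit's --packages path and the new ADD JAR ivy:// path share
    // one implementation. Names below are illustrative.
    private[spark] object DependencyUtils {

      // Read the three Ivy-related settings from system properties in one place.
      private def getIvyProperties(): (Option[String], Option[String], Option[String]) = {
        val Seq(repositories, ivyRepoPath, ivySettingsPath) =
          Seq("spark.jars.repositories", "spark.jars.ivy", "spark.jars.ivySettings")
            .map(sys.props.get)
        (repositories, ivyRepoPath, ivySettingsPath)
      }

      // Entry point for `ADD JAR ivy://group:artifact:version`.
      def resolveMavenDependencies(uri: URI): String = {
        val (repositories, ivyRepoPath, ivySettingsPath) = getIvyProperties()
        val ivySettings = ivySettingsPath match {
          case Some(path) =>
            SparkSubmitUtils.loadIvySettings(path, repositories, ivyRepoPath)
          case None =>
            SparkSubmitUtils.buildIvySettings(repositories, ivyRepoPath)
        }
        SparkSubmitUtils.resolveMavenCoordinates(uri.getAuthority, ivySettings)
      }
    }

Keeping resolution behind one object would mean SparkSubmit, Utils, and the SQL ADD JAR path all call the same code, which addresses the three-identically-named-methods concern above.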
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]