tgravescs commented on a change in pull request #32746:
URL: https://github.com/apache/spark/pull/32746#discussion_r645153785
##########
File path: core/src/main/scala/org/apache/spark/util/DependencyUtils.scala
##########
@@ -39,11 +40,11 @@ private[spark] object DependencyUtils extends Logging {
def getIvyProperties(): IvyProperties = {
val Seq(packagesExclusions, packages, repositories, ivyRepoPath,
ivySettingsPath) = Seq(
- "spark.jars.excludes",
- "spark.jars.packages",
- "spark.jars.repositories",
- "spark.jars.ivy",
- "spark.jars.ivySettings"
+ config.JAR_PACKAGES_EXCLUSIONS.key,
Review comment:
import config._ and remove the config. part from here
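
Applied to the hunk above, the suggestion would read roughly like this (a sketch only; the wildcard import and the `.map(...)` tail are assumed from the surrounding method, not part of this diff):

```scala
import org.apache.spark.internal.config._

// Inside DependencyUtils.getIvyProperties(): with the wildcard import in
// scope, each key loses its `config.` prefix.
val Seq(packagesExclusions, packages, repositories, ivyRepoPath,
  ivySettingsPath) = Seq(
  JAR_PACKAGES_EXCLUSIONS.key,
  JAR_PACKAGES.key,
  JAR_REPOSITORIES.key,
  JAR_IVY_REPO_PATH.key,
  JAR_IVY_SETTING_PATH.key
).map(sys.props.get(_).orNull)
```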
##########
File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
##########
@@ -2148,4 +2148,37 @@ package object config {
// batch of block will be loaded in memory with memory mapping, which has higher overhead
// with small MB sized chunk of data.
.createWithDefaultString("3m")
+
+ private[spark] val JAR_IVY_REPO_PATH =
+ ConfigBuilder("spark.jars.ivy")
+ .version("1.3.0")
+ .stringConf
+ .createOptional
+
+ private[spark] val JAR_IVY_SETTING_PATH =
+ ConfigBuilder("spark.jars.ivySettings")
+ .version("2.2.0")
+ .stringConf
+ .createOptional
+
+ private[spark] val JAR_PACKAGES =
+ ConfigBuilder("spark.jars.packages")
+ .version("1.5.0")
+ .stringConf
+ .toSequence
+ .createWithDefault(Nil)
+
+ private[spark] val JAR_PACKAGES_EXCLUSIONS =
+ ConfigBuilder("spark.jars.excludes")
+ .version("1.5.0")
+ .stringConf
+ .toSequence
+ .createWithDefault(Nil)
+
+ private[spark] val JAR_REPOSITORIES =
+ ConfigBuilder("spark.jars.repositories")
Review comment:
yes, would be great to have docs; they can be copied from configuration.md, so it should be easy to do.
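
Adding docs would mean a `.doc(...)` call on each builder, for example (a sketch; the doc text here paraphrases configuration.md and is illustrative, not the final wording):

```scala
private[spark] val JAR_IVY_REPO_PATH =
  ConfigBuilder("spark.jars.ivy")
    // Doc string copied/adapted from the spark.jars.ivy entry in configuration.md.
    .doc("Path to specify the Ivy user directory, used for the local Ivy cache " +
      "and package files from spark.jars.packages.")
    .version("1.3.0")
    .stringConf
    .createOptional
```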
##########
File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
##########
@@ -2148,4 +2148,37 @@ package object config {
// batch of block will be loaded in memory with memory mapping, which has higher overhead
// with small MB sized chunk of data.
.createWithDefaultString("3m")
+
+ private[spark] val JAR_IVY_REPO_PATH =
+ ConfigBuilder("spark.jars.ivy")
+ .version("1.3.0")
+ .stringConf
+ .createOptional
+
+ private[spark] val JAR_IVY_SETTING_PATH =
+ ConfigBuilder("spark.jars.ivySettings")
+ .version("2.2.0")
+ .stringConf
+ .createOptional
+
+ private[spark] val JAR_PACKAGES =
+ ConfigBuilder("spark.jars.packages")
+ .version("1.5.0")
+ .stringConf
+ .toSequence
+ .createWithDefault(Nil)
+
+ private[spark] val JAR_PACKAGES_EXCLUSIONS =
+ ConfigBuilder("spark.jars.excludes")
+ .version("1.5.0")
+ .stringConf
+ .toSequence
+ .createWithDefault(Nil)
+
+ private[spark] val JAR_REPOSITORIES =
+ ConfigBuilder("spark.jars.repositories")
+ .version("2.3.0")
+ .stringConf
+ .toSequence
+ .createWithDefault(Nil)
Review comment:
yeah, I think optional is better here; the first 2 are that way. Any reason these were changed?
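
Making these optional, in line with the first two entries, would look like this (a sketch; callers would then handle the `Option[String]` themselves, e.g. splitting on commas or falling back to an empty value, which is an assumption about the call sites, not something shown in this diff):

```scala
private[spark] val JAR_PACKAGES =
  ConfigBuilder("spark.jars.packages")
    .version("1.5.0")
    .stringConf
    // Optional: absence of the conf is distinguishable from an empty list,
    // matching JAR_IVY_REPO_PATH and JAR_IVY_SETTING_PATH above.
    .createOptional
```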
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]