dgd-contributor commented on a change in pull request #32746:
URL: https://github.com/apache/spark/pull/32746#discussion_r645223037
##########
File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
##########
@@ -2148,4 +2148,37 @@ package object config {
// batch of blocks will be loaded in memory with memory mapping, which
// has higher overhead with small MB sized chunk of data.
.createWithDefaultString("3m")
+
+ private[spark] val JAR_IVY_REPO_PATH =
+ ConfigBuilder("spark.jars.ivy")
+ .version("1.3.0")
+ .stringConf
+ .createOptional
+
+ private[spark] val JAR_IVY_SETTING_PATH =
+ ConfigBuilder("spark.jars.ivySettings")
+ .version("2.2.0")
+ .stringConf
+ .createOptional
+
+ private[spark] val JAR_PACKAGES =
+ ConfigBuilder("spark.jars.packages")
+ .version("1.5.0")
+ .stringConf
+ .toSequence
+ .createWithDefault(Nil)
+
+ private[spark] val JAR_PACKAGES_EXCLUSIONS =
+ ConfigBuilder("spark.jars.excludes")
+ .version("1.5.0")
+ .stringConf
+ .toSequence
+ .createWithDefault(Nil)
+
+ private[spark] val JAR_REPOSITORIES =
+ ConfigBuilder("spark.jars.repositories")
+ .version("2.3.0")
+ .stringConf
+ .toSequence
+ .createWithDefault(Nil)
Review comment:
The last 3 configs are comma-separated lists, so I thought it would be
better to give them a default of Nil, since I found some previous configs do that.
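For context, a comma-separated config with a `Nil` default behaves roughly like this. This is a standalone sketch with a hypothetical `parseSeq` helper, not Spark's actual `ConfigBuilder`/`ConfigEntry` implementation:

```scala
// Minimal sketch of how `.stringConf.toSequence.createWithDefault(Nil)`
// resolves a value: an unset config yields the empty list rather than null,
// and a set value is split on commas. `SeqConfDemo` is a hypothetical helper.
object SeqConfDemo {
  def parseSeq(raw: Option[String], default: Seq[String] = Nil): Seq[String] =
    raw match {
      case Some(s) => s.split(",").map(_.trim).filter(_.nonEmpty).toSeq
      case None    => default
    }

  def main(args: Array[String]): Unit = {
    // Unset config falls back to Nil, so callers can iterate safely.
    assert(parseSeq(None) == Nil)
    // A "spark.jars.packages"-style value splits into coordinates.
    assert(parseSeq(Some("org.foo:a:1.0, org.bar:b:2.0")) ==
      Seq("org.foo:a:1.0", "org.bar:b:2.0"))
    println("ok")
  }
}
```

Defaulting to `Nil` instead of an `Option` lets downstream code map over the sequence without an extra unwrapping step.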
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]