srowen commented on a change in pull request #23550: [SPARK-26621][CORE] Use ConfigEntry for hardcoded configs for shuffle categories.
URL: https://github.com/apache/spark/pull/23550#discussion_r248308853
 
 

 ##########
 File path: core/src/test/scala/org/apache/spark/util/collection/ExternalAppendOnlyMapSuite.scala
 ##########
 @@ -253,7 +253,7 @@ class ExternalAppendOnlyMapSuite extends SparkFunSuite
   private def testSimpleSpilling(codec: Option[String] = None, encrypt: Boolean = false): Unit = {
     val size = 1000
     val conf = createSparkConf(loadDefaults = true, codec)  // Load defaults for Spark home
-    conf.set("spark.shuffle.spill.numElementsForceSpillThreshold", (size / 4).toString)
+    conf.set(SHUFFLE_SPILL_NUM_ELEMENTS_FORCE_SPILL_THRESHOLD.key, (size / 4).toString)
 
 Review comment:
   One broad question about most of these changes: can this not be
   `conf.set(SHUFFLE_SPILL_NUM_ELEMENTS_FORCE_SPILL_THRESHOLD, size / 4)`,
   at least in Scala? I thought the idea of this typed API was that you could pass a typed ConfigEntry together with a value of the matching type, instead of the `.key` string and a stringified value.
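   To make the suggestion concrete, here is a minimal sketch of the two styles side by side. It assumes the entry is the Int-typed `SHUFFLE_SPILL_NUM_ELEMENTS_FORCE_SPILL_THRESHOLD` from `org.apache.spark.internal.config`, and that `SparkConf` exposes a Spark-internal typed overload roughly of the form `set[T](entry: ConfigEntry[T], value: T)`; the object name `TypedConfSketch` is made up for illustration, and the exact visibility/signature should be checked against the branch.

   ```scala
   // Sketch only: placed in this package because the typed set/get overloads
   // on SparkConf are assumed to be private[spark].
   package org.apache.spark

   import org.apache.spark.internal.config.SHUFFLE_SPILL_NUM_ELEMENTS_FORCE_SPILL_THRESHOLD

   object TypedConfSketch {
     def main(args: Array[String]): Unit = {
       val size = 1000
       val conf = new SparkConf(loadDefaults = false)

       // Style used in the diff: string key plus a stringified value.
       conf.set(SHUFFLE_SPILL_NUM_ELEMENTS_FORCE_SPILL_THRESHOLD.key, (size / 4).toString)

       // Style suggested above: typed ConfigEntry plus an Int value,
       // with no .key and no .toString round trip.
       conf.set(SHUFFLE_SPILL_NUM_ELEMENTS_FORCE_SPILL_THRESHOLD, size / 4)

       // The typed getter then returns an Int rather than a String.
       val threshold: Int = conf.get(SHUFFLE_SPILL_NUM_ELEMENTS_FORCE_SPILL_THRESHOLD)
       println(threshold)
     }
   }
   ```

   The appeal of the second style is that a mismatched value type fails at compile time instead of surfacing later as a string-parsing problem at runtime.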

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
