Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19760#discussion_r151374800
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkConf.scala ---
    @@ -663,8 +663,10 @@ private[spark] object SparkConf extends Logging {
           AlternateConfig("spark.yarn.jar", "2.0")),
         "spark.yarn.access.hadoopFileSystems" -> Seq(
           AlternateConfig("spark.yarn.access.namenodes", "2.2")),
    -    "spark.maxRemoteBlockSizeFetchToMem" -> Seq(
    -      AlternateConfig("spark.reducer.maxReqSizeShuffleToMem", "2.3"))
    +    MAX_REMOTE_BLOCK_SIZE_FETCH_TO_MEM.key -> Seq(
    +      AlternateConfig("spark.reducer.maxReqSizeShuffleToMem", "2.3")),
    +    LISTENER_BUS_EVENT_QUEUE_CAPACITY.key -> Seq(
    +      AlternateConfig("spark.scheduler.listenerbus.eventqueue.size", "2.3"))
    --- End diff ---
    
    Now it only works with Spark Core confs, right?
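
    The deprecated-key remapping this diff touches can be sketched roughly as below. This is a simplified, hypothetical model of SparkConf's `AlternateConfig` handling, not the actual implementation; the map keys mirror the string values of `MAX_REMOTE_BLOCK_SIZE_FETCH_TO_MEM.key` and `LISTENER_BUS_EVENT_QUEUE_CAPACITY.key` from the diff:

    ```scala
    // Simplified sketch: each current config key maps to the deprecated
    // alternates that should still be honored (with the version in which
    // they were deprecated).
    case class AlternateConfig(key: String, version: String)

    val configsWithAlternatives: Map[String, Seq[AlternateConfig]] = Map(
      "spark.maxRemoteBlockSizeFetchToMem" -> Seq(
        AlternateConfig("spark.reducer.maxReqSizeShuffleToMem", "2.3")),
      "spark.scheduler.listenerbus.eventqueue.capacity" -> Seq(
        AlternateConfig("spark.scheduler.listenerbus.eventqueue.size", "2.3")))

    // Reverse index: deprecated key -> current key, so a value set under
    // the old name can be read back under the new one.
    val allAlternatives: Map[String, String] =
      configsWithAlternatives.toSeq.flatMap { case (newKey, alts) =>
        alts.map(alt => alt.key -> newKey)
      }.toMap

    // Translate a user-supplied key to its current name, if it is deprecated.
    def translate(userKey: String): String =
      allAlternatives.getOrElse(userKey, userKey)
    ```

    With this index, `translate("spark.reducer.maxReqSizeShuffleToMem")` yields `"spark.maxRemoteBlockSizeFetchToMem"`, while unknown keys pass through unchanged. The reviewer's question stands: the table only covers keys registered here, so alternates for confs defined outside Spark Core would not be picked up.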


---
