GitHub user JoshRosen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18083#discussion_r118431597
  
    --- Diff: core/src/main/scala/org/apache/spark/scheduler/LiveListenerBus.scala ---
    @@ -33,25 +37,24 @@ import org.apache.spark.util.Utils
      * has started will events be actually propagated to all attached listeners. This listener bus
      * is stopped when `stop()` is called, and it will drop further events after stopping.
      */
    -private[spark] class LiveListenerBus(val sparkContext: SparkContext) extends SparkListenerBus {
    +private[spark] class LiveListenerBus(conf: SparkConf) extends SparkListenerBus {
     
       self =>
     
       import LiveListenerBus._
     
    +  private var sparkContext: SparkContext = _
    +
       // Cap the capacity of the event queue so we get an explicit error (rather than
       // an OOM exception) if it's perpetually being added to more quickly than it's being drained.
    -  private lazy val EVENT_QUEUE_CAPACITY = validateAndGetQueueSize()
    -  private lazy val eventQueue = new LinkedBlockingQueue[SparkListenerEvent](EVENT_QUEUE_CAPACITY)
    -
    -  private def validateAndGetQueueSize(): Int = {
    -    val queueSize = sparkContext.conf.get(LISTENER_BUS_EVENT_QUEUE_SIZE)
    -    if (queueSize <= 0) {
    -      throw new SparkException("spark.scheduler.listenerbus.eventqueue.size must be > 0!")
    -    }
    -    queueSize
    +  private val eventQueue = {
    +    val capacity = conf.get(LISTENER_BUS_EVENT_QUEUE_SIZE)
    +    require(capacity > 0, s"${LISTENER_BUS_EVENT_QUEUE_SIZE.key} must be > 0!")
    --- End diff --
    
    Nice, I didn't know about that. I'll move it in my next update.
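
    For readers following along: `require` comes from Scala's `Predef` and throws an `IllegalArgumentException` with the given message when the predicate is false, which is what lets the new code drop the hand-rolled `SparkException` check. A minimal, self-contained sketch of the idiom (the key string and capacity value below are hypothetical stand-ins for `LISTENER_BUS_EVENT_QUEUE_SIZE` and the value read from `SparkConf`, not Spark's actual wiring):

    ```scala
    import java.util.concurrent.LinkedBlockingQueue

    object QueueCapacityCheck {
      def main(args: Array[String]): Unit = {
        // Hypothetical stand-ins for LISTENER_BUS_EVENT_QUEUE_SIZE.key and the
        // value that Spark would read from SparkConf.
        val capacityKey = "spark.scheduler.listenerbus.eventqueue.size"
        val capacity = 10000

        // require throws IllegalArgumentException with the interpolated message
        // when the predicate is false, replacing the explicit
        // `if (queueSize <= 0) throw new SparkException(...)` pattern.
        require(capacity > 0, s"$capacityKey must be > 0!")

        // A bounded queue fails explicitly once full instead of growing until OOM.
        val eventQueue = new LinkedBlockingQueue[String](capacity)
        println(s"queue created, remaining capacity = ${eventQueue.remainingCapacity()}")
      }
    }
    ```

    One nicety visible in the diff: interpolating `${LISTENER_BUS_EVENT_QUEUE_SIZE.key}` instead of hard-coding the string keeps the error message in sync with the config entry's actual key.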

