Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21885#discussion_r206641823
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
    @@ -2668,17 +2668,24 @@ object SparkContext extends Logging {
       }
     
       /**
    -   * The number of driver cores to use for execution in local mode, 0 otherwise.
    +   * The number of cores available to the driver to use for tasks such as IO with Netty
        */
    -  private[spark] def numDriverCores(master: String): Int = {
    +  private[spark] def numDriverCores(master: String, conf: SparkConf = null): Int = {
         def convertToInt(threads: String): Int = {
           if (threads == "*") Runtime.getRuntime.availableProcessors() else threads.toInt
         }
         master match {
           case "local" => 1
           case SparkMasterRegex.LOCAL_N_REGEX(threads) => convertToInt(threads)
           case SparkMasterRegex.LOCAL_N_FAILURES_REGEX(threads, _) => convertToInt(threads)
    -      case _ => 0 // driver is not used for execution
    +      case "yarn" =>
    --- End diff ---
    
    I think you can just write this as `case "yarn" if (conf ...) =>`; not sure whether that gets too long. Otherwise I'd move the `else` up a line to fix the style below.
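    
    For concreteness, a rough sketch of the guard form being suggested; the deploy-mode check and the `spark.driver.cores` fallback are illustrative assumptions here, not necessarily what the PR's branch body actually does:
    
    ```scala
    master match {
      case "local" => 1
      case SparkMasterRegex.LOCAL_N_REGEX(threads) => convertToInt(threads)
      case SparkMasterRegex.LOCAL_N_FAILURES_REGEX(threads, _) => convertToInt(threads)
      // Hypothetical condition: assumes driver cores only matter for yarn in cluster mode.
      // The guard folds the null check and the config test into the pattern itself,
      // so the branch body stays a single expression with no if/else to indent.
      case "yarn" if conf != null && conf.get("spark.submit.deployMode", "client") == "cluster" =>
        conf.getInt("spark.driver.cores", 1)  // default of 1 is an assumption for this sketch
      case _ => 0 // driver is not used for execution
    }
    ```
    
    Written that way the dangling `else` the style note refers to goes away entirely, at the cost of a fairly long guard line.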


---

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]

Reply via email to