Github user zsxwing commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22473#discussion_r221022651
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkConf.scala ---
    @@ -609,13 +609,13 @@ class SparkConf(loadDefaults: Boolean) extends Cloneable with Logging with Seria
         require(!encryptionEnabled || get(NETWORK_AUTH_ENABLED),
           s"${NETWORK_AUTH_ENABLED.key} must be enabled when enabling encryption.")
     
    -    val executorTimeoutThreshold = getTimeAsSeconds("spark.network.timeout", "120s")
    -    val executorHeartbeatInterval = getTimeAsSeconds("spark.executor.heartbeatInterval", "10s")
    +    val executorTimeoutThreshold = getTimeAsMs("spark.network.timeout", "120s")
    --- End diff --
    
    Could you change `getTimeAsMs` back to `getTimeAsSeconds`? There is a subtle difference when the user doesn't specify a time unit: `getTimeAsMs` defaults to milliseconds, while `getTimeAsSeconds` defaults to seconds.
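
    To illustrate the point above, here is a minimal self-contained sketch (it is not Spark's actual parser, and `parseMs` is a hypothetical helper) showing how the fallback unit changes the meaning of a bare numeric value like `"120"`:

    ```scala
    // Sketch: both getters accept suffixed values like "120s" or "120ms",
    // but when the suffix is missing they fall back to different units.
    object TimeUnitDefaults {
      // Parse "120s", "120ms", or a bare "120" (fallback unit supplied by caller),
      // returning milliseconds.
      private def parseMs(s: String, defaultUnit: String): Long = {
        val trimmed = s.trim
        val (num, unit) =
          if (trimmed.endsWith("ms")) (trimmed.dropRight(2), "ms")
          else if (trimmed.endsWith("s")) (trimmed.dropRight(1), "s")
          else (trimmed, defaultUnit)
        val n = num.trim.toLong
        if (unit == "ms") n else n * 1000L
      }

      def getTimeAsMs(value: String): Long = parseMs(value, "ms")
      def getTimeAsSeconds(value: String): Long = parseMs(value, "s") / 1000L

      def main(args: Array[String]): Unit = {
        // With an explicit unit the two getters agree (modulo the unit of the result):
        assert(getTimeAsMs("120s") == 120000L)
        assert(getTimeAsSeconds("120s") == 120L)
        // Without a unit they diverge: "120" means 120 ms to one, 120 s to the other.
        assert(getTimeAsMs("120") == 120L)
        assert(getTimeAsSeconds("120") == 120L)
      }
    }
    ```

    So a user who sets `spark.network.timeout=120` (no suffix) would see the effective timeout shrink by a factor of 1000 if the config were read with `getTimeAsMs`, which is why keeping `getTimeAsSeconds` preserves backward compatibility.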


---
