Github user rxin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/4571#discussion_r24617668
  
    --- Diff: core/src/main/scala/org/apache/spark/metrics/MetricsConfig.scala ---
    @@ -47,21 +47,27 @@ private[spark] class MetricsConfig(val configFile: Option[String]) extends Logging
         setDefaultProperties(properties)
     
         // If spark.metrics.conf is not set, try to get file in class path
    -    var is: InputStream = null
    -    try {
    -      is = configFile match {
    -        case Some(f) => new FileInputStream(f)
    -        case None => Utils.getSparkClassLoader.getResourceAsStream(METRICS_CONF)
    +    (configFile match {
    --- End diff ---
    
    I've been thinking about style quite a bit recently as I'm working on a style guide. In this case, you'd want to break the chain up into intermediate variables; otherwise the cognitive complexity of the chaining/nesting makes it hard to understand.
    
    I wrote 3 versions, each one progressively more Java-like. I think any of them would be fine here.
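
    For reference, here is roughly the shape of the chained expression the diff introduces (a sketch reconstructed from the hunk above, not the exact code in the PR), to show where the nesting cost comes from:
    ```scala
    // Sketch only: approximates the chained style under discussion, not the PR's exact code.
    (configFile match {
      case Some(f) => Option(new FileInputStream(f))
      case None => Option(Utils.getSparkClassLoader.getResourceAsStream(METRICS_CONF))
    }).foreach { is =>
      try {
        properties.load(is)
      } finally {
        is.close()
      }
    }
    ```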
    
    version 1
    ```scala
    val isOpt: Option[InputStream] = configFile.map(new FileInputStream(_)).orElse {
      try {
        Option(Utils.getSparkClassLoader.getResourceAsStream(METRICS_CONF))
      } catch {
        case e: Exception =>
          logError("Error loading default configuration file", e)
          None
      }
    }
    
    isOpt.foreach { is =>
      try {
        properties.load(is)
      } finally {
        is.close()
      }
    }
    ```
    
    version 2
    ```scala
    var isOpt: Option[InputStream] = configFile.map { f =>
      logInfo(s"Loading MetricsConfig file: $f")
      new FileInputStream(f)
    }
    
    if (isOpt.isEmpty) {
      try {
        isOpt = Option(Utils.getSparkClassLoader.getResourceAsStream(METRICS_CONF))
      } catch {
        case e: Exception =>
          logError("Error loading default configuration file", e)
      }
    }
    
    isOpt.foreach { is =>
      try {
        properties.load(is)
      } finally {
        is.close()
      }
    }
    ```
    
    version 3
    ```scala
    var is: InputStream = configFile.map(new FileInputStream(_)).orNull
    
    if (is == null) {
      try {
        is = Utils.getSparkClassLoader.getResourceAsStream(METRICS_CONF)
      } catch {
        case e: Exception =>
          logError("Error loading default configuration file", e)
      }
    }
    
    if (is != null) {
      try {
        properties.load(is)
      } finally {
        is.close()
      }
    }
    ```


