Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19068#discussion_r138619511
  
    --- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveUtils.scala ---
    @@ -232,6 +232,54 @@ private[spark] object HiveUtils extends Logging {
       }
     
       /**
    +   * Generates an instance of [[HiveConf]] from a [[SparkConf]], a hadoop [[Configuration]],
    +   * and formatted extra time configurations, using an isolated classloader if isolationOn is
    +   * set, for [[HiveClient]] construction.
    +   * @param sparkConf a [[SparkConf]] object specifying Spark parameters
    +   * @param classLoader an isolated classloader, needed if isolationOn is set for [[HiveClient]]
    +   *                    construction
    +   * @param hadoopConf a hadoop [[Configuration]] object; optional, generated from the
    +   *                   sparkConf if not supplied
    +   * @param extraTimeConfs time configurations formatted as long values from the given
    +   *                       hadoopConf
    +   */
    +  private[hive] def newHiveConfigurations(
    +      sparkConf: SparkConf = new SparkConf(loadDefaults = true),
    +      classLoader: ClassLoader = null)(
    +      hadoopConf: Configuration = SparkHadoopUtil.get.newConfiguration(sparkConf))(
    +      extraTimeConfs: Map[String, String] = formatTimeVarsForHiveClient(hadoopConf)): HiveConf = {
    --- End diff --
    
    How about we remove these default values and explicitly specify them in https://github.com/apache/spark/pull/19068/files#diff-f7aac41bf732c1ba1edbac436d331a55R84?
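    
    For context, the signature is curried because Scala only lets a default value refer to parameters of an *earlier* parameter list, yet callers relying on every default must still write the empty argument lists. Here is a minimal, self-contained sketch of the trade-off, using `Map` as a stand-in for the real `SparkConf`/`Configuration`/`HiveConf` types (object and key names here are illustrative, not from the PR):
    
    ```scala
    object CurriedDefaultsDemo {
    
      // A default in a later parameter list may reference a parameter from an
      // earlier list (hadoopConf's default is derived from sparkConf), which is
      // the reason the signature under review is curried.
      def newConf(
          sparkConf: Map[String, String] = Map("spark.app.name" -> "demo"))(
          hadoopConf: Map[String, String] = sparkConf + ("fs.defaultFS" -> "file:///"))(
          extraTimeConfs: Map[String, String] = Map("socket.timeout" -> "1800000"))
          : Map[String, String] =
        sparkConf ++ hadoopConf ++ extraTimeConfs
    
      def main(args: Array[String]): Unit = {
        // Relying on all defaults still forces the empty argument lists:
        val viaDefaults = newConf()()()
    
        // The review suggestion: drop the defaults and pass each argument
        // explicitly at the call site, making the data flow visible.
        val spark  = Map("spark.app.name" -> "demo")
        val hadoop = spark + ("fs.defaultFS" -> "file:///")
        val extra  = Map("socket.timeout" -> "1800000")
        val explicit = newConf(spark)(hadoop)(extra)
    
        assert(viaDefaults == explicit)
      }
    }
    ```
    
    Explicit arguments produce the same result but avoid a public signature whose defaults hide nontrivial construction logic.
    
    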


---
