Github user yhuai commented on a diff in the pull request:

    https://github.com/apache/spark/pull/14518#discussion_r73811203
  
    --- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/orc/OrcOptions.scala ---
    @@ -17,27 +17,40 @@
     
     package org.apache.spark.sql.hive.orc
     
    +import org.apache.hadoop.conf.Configuration
    +
     /**
      * Options for the ORC data source.
      */
     private[orc] class OrcOptions(
    -    @transient private val parameters: Map[String, String])
    +    @transient private val parameters: Map[String, String],
    +    @transient private val conf: Configuration)
       extends Serializable {
     
       import OrcOptions._
     
       /**
    -   * Compression codec to use. By default snappy compression.
    +   * Compression codec to use. By default use the value specified in Hadoop configuration.
    +   * If `orc.compress` is unset, then we use snappy.
        * Acceptable values are defined in [[shortOrcCompressionCodecNames]].
        */
       val compressionCodec: String = {
    -    val codecName = parameters.getOrElse("compression", "snappy").toLowerCase
    -    if (!shortOrcCompressionCodecNames.contains(codecName)) {
    -      val availableCodecs = shortOrcCompressionCodecNames.keys.map(_.toLowerCase)
    -      throw new IllegalArgumentException(s"Codec [$codecName] " +
    -        s"is not available. Available codecs are ${availableCodecs.mkString(", ")}.")
    +    val default = conf.get(OrcRelation.ORC_COMPRESSION, "SNAPPY")
    --- End diff --
    
    Sorry, maybe I did not explain it clearly in the JIRA. The use case I mentioned was `df.write.option("orc.compress", ...)`. We do not need to look at the Hadoop conf.
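    For reference, a minimal sketch of that use case, assuming a Spark 2.x session with Hive support (the ORC data source lives in the `sql/hive` module here) and an illustrative output path; `zlib` is just an example value for `orc.compress`:
    
    ```scala
    import org.apache.spark.sql.SparkSession
    
    // Hypothetical illustration only: pass the ORC codec directly as a write
    // option so the data source can pick it up from its options map, without
    // consulting the Hadoop Configuration. Session setup, codec value, and
    // output path are placeholders.
    val spark = SparkSession.builder()
      .appName("OrcCompressOptionExample")
      .enableHiveSupport() // ORC is provided by the Hive module in this version
      .getOrCreate()
    
    val df = spark.range(100).toDF("id")
    
    df.write
      .option("orc.compress", "zlib") // example codec; snappy would remain the fallback default
      .orc("/tmp/orc_compress_example")
    ```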

