Github user dongjoon-hyun commented on a diff in the pull request:
https://github.com/apache/spark/pull/19218#discussion_r139186318
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/InsertIntoHiveTable.scala ---
@@ -101,6 +101,13 @@ case class InsertIntoHiveTable(
    val tmpLocation = getExternalTmpPath(sparkSession, hadoopConf, tableLocation)
    val fileSinkConf = new FileSinkDesc(tmpLocation.toString, tableDesc, false)
+    tableDesc.getOutputFileFormatClassName match {
+      case "org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat" =>
--- End diff --
- **Parquet**: It seems you also need to handle another output format, [parquet.hive.DeprecatedParquetOutputFormat](https://github.com/apache/spark/blob/master/sql/core/src/test/scala/org/apache/spark/sql/execution/command/DDLParserSuite.scala#L1173).
- **ORC**: We already have [spark.sql.orc.compression.codec](https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala#L320), added by SPARK-21839.
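
A minimal sketch of what the reviewer is suggesting: the match on `getOutputFileFormatClassName` should cover both Parquet output format class names, and ORC could be routed to its own codec config. The helper name `compressionConfKey` and its shape are illustrative assumptions, not the PR's actual code; the config keys shown are the standard Spark SQL ones.

```scala
// Hedged sketch (not the PR's actual implementation): map a Hive output
// format class name to the Spark SQL compression config that governs it.
// Covers both Parquet output formats, per the review comment.
def compressionConfKey(outputFormatClassName: String): Option[String] =
  outputFormatClassName match {
    // Both the current and the deprecated Parquet output formats
    case "org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat" |
         "parquet.hive.DeprecatedParquetOutputFormat" =>
      Some("spark.sql.parquet.compression.codec")
    // ORC has its own codec config since SPARK-21839
    case "org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat" =>
      Some("spark.sql.orc.compression.codec")
    case _ => None
  }
```

Matching on a multi-pattern `case a | b` keeps the deprecated class name from silently falling through to the default branch.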
---