Github user dongjoon-hyun commented on a diff in the pull request:
https://github.com/apache/spark/pull/19547#discussion_r146109395
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveShim.scala ---
@@ -585,6 +585,35 @@ private[client] class Shim_v0_13 extends Shim_v0_12 {
    * Unsupported predicates are skipped.
    */
   def convertFilters(table: Table, filters: Seq[Expression]): String = {
+    if (SQLConf.get.advancedPartitionPredicatePushdownEnabled) {
+      convertComplexFilters(table, filters)
+    } else {
+      convertBasicFilters(table, filters)
+    }
--- End diff ---
Nit: can we remove the duplication of the `varcharKeys` logic by moving it into
`convertFilters`, like the following?
```scala
def convertFilters(table: Table, filters: Seq[Expression]): String = {
  // hive varchar is treated as catalyst string, but hive varchar can't be pushed down.
  lazy val varcharKeys = table.getPartitionKeys.asScala
    .filter(col => col.getType.startsWith(serdeConstants.VARCHAR_TYPE_NAME) ||
      col.getType.startsWith(serdeConstants.CHAR_TYPE_NAME))
    .map(col => col.getName).toSet
  if (SQLConf.get.advancedPartitionPredicatePushdownEnabled) {
    convertComplexFilters(table, filters, varcharKeys)
  } else {
    convertBasicFilters(table, filters, varcharKeys)
  }
}
```
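For reference, a rough sketch of how the two helpers could take the shared set. The
added `varcharKeys: Set[String]` parameter is the point of the suggestion; the method
bodies below are placeholders, not the actual implementation in this PR:
```scala
// Sketch only: both helpers gain a varcharKeys parameter; bodies are elided.
private def convertBasicFilters(
    table: Table,
    filters: Seq[Expression],
    varcharKeys: Set[String]): String = {
  // same basic conversion as before, but consulting the passed-in varcharKeys
  // instead of recomputing the set from table.getPartitionKeys
  ???
}

private def convertComplexFilters(
    table: Table,
    filters: Seq[Expression],
    varcharKeys: Set[String]): String = {
  // same advanced conversion as before, sharing the single varcharKeys computation
  ???
}
```
This way the varchar/char partition-key detection lives in one place and both code paths
stay in sync if the check ever changes.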
---