Github user gatorsmile commented on the issue:
https://github.com/apache/spark/pull/21360
@TomaszGaweda @maryannxue Let us reduce the complexity and introduce a new
SQL configuration option for controlling JDBC predicate pushdown.
```Scala
val JDBC_FILTER_PUSHDOWN_ENABLED =
  buildConf("spark.sql.jdbc.filterPushdown")
    .doc("Enables JDBC filter push-down optimization when set to true.")
    .booleanConf
    .createWithDefault(true)
```
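Assuming the conf is registered in `SQLConf` as above, users could then toggle the optimization per session without code changes, e.g.:

```SQL
-- Disable JDBC filter pushdown for the current session (sketch;
-- assumes the proposed spark.sql.jdbc.filterPushdown conf exists)
SET spark.sql.jdbc.filterPushdown=false;
```

Defaulting to `true` preserves the current behavior, while the flag gives users an escape hatch when a JDBC dialect compiles a predicate incorrectly.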