Yin Huai created SPARK-11661:
--------------------------------
Summary: We should still pushdown filters returned by a data
source's unhandledFilters
Key: SPARK-11661
URL: https://issues.apache.org/jira/browse/SPARK-11661
Project: Spark
Issue Type: Bug
Components: SQL
Reporter: Yin Huai
Priority: Blocker
We added the unhandledFilters interface in SPARK-10978, giving a data source a
way to tell Spark SQL which of the pushed-down filters it may not apply to
every row; Spark SQL then uses a Filter operator to re-evaluate those filters.
However, even if a filter appears in the returned unhandledFilters, we should
still push it down to the data source. For example, our internal data sources
do not override this method; if we do not push down the filters they report as
unhandled, we are effectively turning off the filter pushdown feature
altogether.
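The intended contract can be sketched as follows. This is a minimal Python simulation of the semantics only, under stated assumptions: the real interface is the Scala method BaseRelation.unhandledFilters in org.apache.spark.sql.sources, and the function and variable names below are illustrative, not Spark's actual implementation.

```python
def plan_scan(all_filters, unhandled_filters):
    """Simulate the planner's split of filters for a data source scan.

    Per this issue: every filter is pushed down to the data source, even
    those the source reports as unhandled; the unhandled ones are ALSO
    re-evaluated by a Filter operator on top of the scan, because the
    source may not apply them to every row.
    Returns (filters pushed to the source, filters re-evaluated by Spark).
    """
    pushed = list(all_filters)  # push down everything, not just handled ones
    post_scan = [f for f in all_filters if f in unhandled_filters]
    return pushed, post_scan

# Example: a source that handles equality filters but reports a
# greater-than filter as unhandled.
filters = ["a = 1", "b > 2"]
unhandled = ["b > 2"]
pushed, post_scan = plan_scan(filters, unhandled)
# pushed contains both filters; post_scan contains only "b > 2"
```

A source that does not override unhandledFilters reports all filters as unhandled; with this logic they are still pushed down, merely re-checked afterwards, rather than dropped from pushdown entirely.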
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)