[
https://issues.apache.org/jira/browse/SPARK-11661?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Yin Huai updated SPARK-11661:
-----------------------------
Fix Version/s: (was: 1.7.0)
> We should still pushdown filters returned by a data source's unhandledFilters
> -----------------------------------------------------------------------------
>
> Key: SPARK-11661
> URL: https://issues.apache.org/jira/browse/SPARK-11661
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Reporter: Yin Huai
> Assignee: Yin Huai
> Priority: Blocker
> Fix For: 1.6.0
>
>
> We added the unhandledFilters interface in SPARK-10978. It gives a data source
> a way to tell Spark SQL which of the pushed filters it may not apply to every
> row, so Spark SQL must still evaluate those filters with a Filter operator.
> However, even when a filter appears in the returned unhandledFilters, we should
> still push it down to the data source. For example, our internal data sources
> do not override this method; if we did not push down the filters they report as
> unhandled, we would effectively turn off filter pushdown for them entirely.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)