GitHub user kevinyu98 opened a pull request: https://github.com/apache/spark/pull/10388
[SPARK-12231][SQL] Create a combineFilters projection when we call buildPartitionedTableScan

Hello Michael & All:

We ran into issues submitting the new code in the other PR (#10299), so we closed that PR and opened this one with the fix.

The previous failure occurred because, when there is a filter that is not pushed down (the "left-over" filter), the projection used for the scan can differ from the original projection in its elements or their ordering.

With the new code, the approach to solve this problem is: insert a new Project when the "left-over" filter is nonempty, the original projection is nonempty, and the scan's projection has more than one element (which could otherwise produce a different column ordering). A minimal sketch of this idea follows the commit log below.

We added 3 test cases to cover the previously failing scenarios.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/kevinyu98/spark spark-12231

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/10388.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #10388

----

commit 2d56ac02eaff10972e5bc46f3b57cff993d60e24
Author: Kevin Yu <q...@us.ibm.com>
Date: 2015-12-18T23:31:05Z

    another approach to fix this problem

commit 305739f872ba90ba9ef4f3ef6c4f812b4024d8e9
Author: Kevin Yu <q...@us.ibm.com>
Date: 2015-12-18T23:46:37Z

    update comments

----
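For reviewers skimming the thread, here is a minimal sketch of the idea described above. This is not the actual patch: the real change lives inside buildPartitionedTableScan, and the names used here (withLeftOverFilter, originalProjection, leftOverFilter, scanPlan) are hypothetical placeholders introduced only for illustration.

    // Hedged sketch, not the actual patch: wrap the scan in the "left-over"
    // Filter, then insert a Project so the plan's output matches the
    // original projection in elements and ordering.
    import org.apache.spark.sql.catalyst.expressions.{Expression, NamedExpression}
    import org.apache.spark.sql.catalyst.plans.logical.{Filter, LogicalPlan, Project}

    def withLeftOverFilter(
        originalProjection: Seq[NamedExpression],
        leftOverFilter: Option[Expression],
        scanPlan: LogicalPlan): LogicalPlan = {
      leftOverFilter match {
        case Some(condition) =>
          val filtered = Filter(condition, scanPlan)
          // Re-project only when the scan's output could differ from the
          // requested projection in elements or ordering.
          if (originalProjection.nonEmpty && scanPlan.output.size > 1) {
            Project(originalProjection, filtered)
          } else {
            filtered
          }
        case None =>
          // No left-over filter: the scan's projection is used as-is.
          scanPlan
      }
    }

The guard mirrors the condition stated above: the extra Project is inserted only when a left-over filter exists and the original projection is nonempty with more than one output column, so plans that were already correct are left untouched.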