Github user HyukjinKwon commented on the pull request:

    https://github.com/apache/spark/pull/10510#issuecomment-168100789
  
    @davies @squito the purpose of `stripSparkFilter` is not to copy the plan but to strip the wrapped Spark-side filter. I am not sure it is right to modify the results of the stripped plans, although I understand and agree that we need to execute such logic anyway, whether inside or outside of `stripSparkFilter`.
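
    For context, a minimal sketch of what such a test helper might look like, assuming Spark internals of that era (`queryExecution`, a physical `Filter` node, and an `internalCreateDataFrame` helper are assumptions here, not necessarily the exact implementation):

    ```scala
    import org.apache.spark.sql.DataFrame

    // Sketch: strip the outermost Spark-side Filter node from a DataFrame's
    // physical plan, so the rows returned reflect only what the data source
    // itself filtered (i.e. the pushed-down filters), not Spark's re-evaluation.
    def stripSparkFilter(df: DataFrame): DataFrame = {
      val schema = df.schema
      val withoutFilter = df.queryExecution.executedPlan.transform {
        // Replace the Spark-side Filter with its child, dropping the
        // Spark-side predicate evaluation.
        case filter: org.apache.spark.sql.execution.Filter => filter.child
      }
      // Hypothetical helper to rebuild a DataFrame from the modified plan.
      sqlContext.internalCreateDataFrame(withoutFilter.execute(), schema)
    }
    ```

    With the Spark-side filter removed, a test can assert on the row count to check whether the source actually applied the pushed filters.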
    
    FYI, this function will end up being used only for ORC. The reason I added it is that we need a way to test the pushed filters correctly until the `unhandledFilter` interface is implemented.
    For Parquet and JDBC, this function will no longer be used. Here are the PRs: for Parquet, https://github.com/apache/spark/pull/10502, and for JDBC, https://github.com/apache/spark/pull/10427. Both were discussed in this PR, https://github.com/apache/spark/pull/10221


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at [email protected] or file a JIRA ticket
with INFRA.
---
