Github user sameeragarwal commented on the pull request:
https://github.com/apache/spark/pull/11618#issuecomment-195497597
Sounds good, thank you. In my branch, I try to address (2) by not adding
new conditions if the child node(s) already have the given constraint. For (3),
please note that pushing `b = c` down isn't useless when `a` comes from the left
side of the join and `b` and `c` both come from the right: the inferred predicate
can then be evaluated entirely on the right side before the join. It is of course
useless if `b` and `c` come from different sides of the join. I think we can have a
slightly smarter filter inference rule for joins that identifies the latter case.
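
Purely for illustration, here is a minimal spark-shell sketch (assuming a Spark 2.x+ `SparkSession` named `spark`; the names `t1`, `t2`, `a`, `b`, `c` are hypothetical) of the case where pushing the inferred `b = c` down is useful, because both columns come from the right side of the join:

```scala
// t1 contributes column `a`, t2 contributes columns `b` and `c`.
val t1 = spark.range(100).toDF("a")
val t2 = spark.range(100).selectExpr("id AS b", "id AS c")

// Join condition a = b AND a = c: by transitivity the optimizer can infer
// b = c, and since b and c both come from t2, that inferred predicate can be
// pushed below the join as a filter on t2 alone.
val joined = t1.join(t2, t1("a") === t2("b") && t1("a") === t2("c"))

// Inspect the optimized plan to see where the inferred filter ends up.
joined.explain(true)
```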