Github user smola commented on the pull request:

    https://github.com/apache/spark/pull/6853#issuecomment-113169112
  
    @marmbrus Yes, this patch is only meant to delay the check until
CheckAnalysis. The reason is that just because the ResolveReferences rule
cannot resolve the plan does not mean that no other rule can resolve it. I
think this is the main idea behind how rules work in Catalyst, right? Each
rule takes care of what it knows and ignores the unknown.
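
    To make that concrete, here is a minimal sketch of a rule written in that
style against the Spark 1.4-era Catalyst API. MyCustomPlan and
ResolveMyCustomPlan are placeholder, user-side names (not part of Spark), and
the string-typed output is just an assumption for the example:

        package com.example.catalyst  // placeholder package for the user-side code

        import org.apache.spark.sql.catalyst.expressions.{Attribute, AttributeReference}
        import org.apache.spark.sql.catalyst.plans.logical.{LeafNode, LogicalPlan}
        import org.apache.spark.sql.catalyst.rules.Rule
        import org.apache.spark.sql.types.StringType

        // Hypothetical custom operator: it starts out unresolved and only gets
        // its output attributes once a resolution rule has filled them in.
        case class MyCustomPlan(columnNames: Seq[String],
                                output: Seq[Attribute] = Nil) extends LeafNode {
          override lazy val resolved: Boolean = output.nonEmpty
        }

        // The rule rewrites only the node it understands and leaves every other
        // node untouched, so it composes with the built-in analysis rules.
        object ResolveMyCustomPlan extends Rule[LogicalPlan] {
          def apply(plan: LogicalPlan): LogicalPlan = plan transformUp {
            case p @ MyCustomPlan(names, Nil) =>
              p.copy(output = names.map(n => AttributeReference(n, StringType)()))
          }
        }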
    
    With respect to my use case, my custom logical plan does produce new
attributes, and I have added resolution rules for it on my side. So yes,
analysis is still checked. The current problem is that I need to maintain a
copy of ResolveReferences (i.e. FixedResolveReferences) in my code, instead
of just adding my new logic in ResolveMyCustomPlan, and then I have to
override SQLContext and the analyzer just to be able to replace the default
rule with mine.
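
    Concretely, the override involved today looks roughly like the sketch
below (again assuming the Spark 1.4-era extension points; MySQLContext is a
hypothetical name). Note that the extendedResolutionRules hook shown here can
only add rules, so actually replacing ResolveReferences with a fixed copy
still means overriding more of the analyzer than this:

        // Placed in package org.apache.spark.sql so SQLContext's protected[sql]
        // analyzer member can be overridden from user code.
        package org.apache.spark.sql

        import com.example.catalyst.ResolveMyCustomPlan
        import org.apache.spark.SparkContext
        import org.apache.spark.sql.catalyst.analysis.Analyzer

        // Hypothetical SQLContext subclass that installs the custom rule.
        class MySQLContext(sc: SparkContext) extends SQLContext(sc) {
          override protected[sql] lazy val analyzer: Analyzer =
            new Analyzer(catalog, functionRegistry, conf) {
              // Only adds to the Resolution batch; the built-in rules, including
              // ResolveReferences, stay exactly as they are. A real override would
              // also keep the extended rules the default SQLContext registers here.
              override val extendedResolutionRules =
                ResolveMyCustomPlan :: Nil
            }
        }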

