GitHub user hvanhovell commented on the pull request:

    https://github.com/apache/spark/pull/9419#issuecomment-153146768
  
    If I understand the problem correctly, the logical Expand operator replaces 
    columns that are not in the current grouping set with ```null```. This means 
    that if a column is used both for grouping and for aggregation, and that 
    column is not in the current grouping set, the column is set to ```null``` 
    and the aggregate computed over it also becomes ```null``` (which is not 
    correct).
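    
    For concreteness, here is a rough sketch of the kind of query this affects 
    (not taken from this PR; the table, column, and object names are made up), 
    assuming Spark SQL from Scala. ```course``` feeds both the grouping and 
    ```COUNT(course)```; with the Expand behaviour described above, the row for 
    the empty grouping set would see only ```null```s for ```course```, so the 
    grand-total count would come out wrong:

```scala
import org.apache.spark.sql.SparkSession

object GroupingSetsCountSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("grouping-sets-sketch")
      .getOrCreate()
    import spark.implicits._

    // `course` is both a grouping column and the input of COUNT(course).
    Seq(("java", 100), ("java", 200), ("scala", 300))
      .toDF("course", "earnings")
      .createOrReplaceTempView("sales")

    // For the () grouping set, an Expand that nulls out `course` before
    // aggregation makes COUNT(course) on that row see only nulls.
    spark.sql(
      """SELECT course, COUNT(course) AS cnt
        |FROM sales
        |GROUP BY course
        |GROUPING SETS ((course), ())""".stripMargin
    ).show()

    spark.stop()
  }
}
```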
    
    What I don't understand is why we don't fix the logical Expand itself by 
    introducing two sets of expressions, one for grouping and one for 
    aggregation. That would be much simpler and would fix the problem at its 
    root.
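    
    To make that idea concrete, below is a toy sketch in plain Scala collections 
    (not Catalyst, and not code from this PR; all names are invented) of what 
    "two sets of expressions" could look like: Expand emits a grouping copy of 
    the column that is nulled per grouping set, plus an untouched copy that the 
    aggregate reads.

```scala
object TwoExpressionSetsSketch {
  // One expanded row per (input row, grouping set).
  final case class Expanded(courseGroup: Option[String], // nulled per grouping set
                            courseAgg: String,           // always the original value
                            earnings: Int,
                            gid: Int)                     // grouping set id

  def main(args: Array[String]): Unit = {
    val input = Seq(("java", 100), ("java", 200), ("scala", 300))

    // Expand: gid 0 = grouping set (course), gid 1 = grouping set ().
    val expanded = input.flatMap { case (course, earnings) =>
      Seq(
        Expanded(Some(course), course, earnings, gid = 0),
        Expanded(None, course, earnings, gid = 1) // grouping copy nulled, agg copy kept
      )
    }

    // Aggregate: group on the grouping copy + gid, but evaluate
    // COUNT(course) over the preserved aggregation copy (ignoring nulls).
    val counts = expanded
      .groupBy(r => (r.courseGroup, r.gid))
      .map { case ((course, gid), rows) =>
        (course.orNull, gid, rows.count(_.courseAgg != null))
      }

    counts.foreach(println)
  }
}
```

    Grouping still happens on the nulled copy (plus the grouping set id), while 
    the count is evaluated over the preserved values, so the grand-total row no 
    longer loses the underlying data.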


