sigmod commented on pull request #32686:
URL: https://github.com/apache/spark/pull/32686#issuecomment-851729720


   > One small ergonomic comment. It would be great if we could create some 
shorthand for the function closures. I would probably make each individual 
value be a matcher for itself (if Enumeration allows subclassing of the Value 
class), and create a bunch of functions that allow you to compose them, e.g.: 
`any`, `all`, ...
   
   I'm not sure exactly what the transformWithPruning interface would look 
like. IIUC, transformWithPruning may still not be able to take just a 
`composed pattern` instead of a lambda, because we also have `and`, `or`, 
`not` over `all`, `any` -- even though they're not frequent. If we'd like to 
put `and`, `or`, `not` into patterns, it gets a bit complex, as we need to be 
able to process a tree of such compositions.
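   
   For illustration only, here is a minimal, self-contained Scala sketch of 
what such a composed condition could look like. It does not use the actual 
Spark/Catalyst API; `TreePattern` here is a stand-in enum, and `anyOf`, 
`allOf`, `toPredicate` and the pattern values are hypothetical names:

```scala
// Hypothetical pattern enum standing in for Catalyst's TreePattern values.
object TreePattern extends Enumeration {
  type TreePattern = Value
  val AGGREGATE, FILTER, JOIN = Value
}

object PatternConds {
  import TreePattern._

  // Pattern conditions as a small ADT: `anyOf`/`allOf` leaves that can be
  // composed with `&&`, `||` and `!`.
  sealed trait Cond {
    def &&(other: Cond): Cond = And(this, other)
    def ||(other: Cond): Cond = Or(this, other)
    def unary_! : Cond = Not(this)
  }
  final case class AnyOf(ps: Set[TreePattern]) extends Cond
  final case class AllOf(ps: Set[TreePattern]) extends Cond
  final case class And(l: Cond, r: Cond) extends Cond
  final case class Or(l: Cond, r: Cond) extends Cond
  final case class Not(c: Cond) extends Cond

  def anyOf(ps: TreePattern*): Cond = AnyOf(ps.toSet)
  def allOf(ps: TreePattern*): Cond = AllOf(ps.toSet)

  // Turning a composed condition back into the lambda that a
  // transformWithPruning-style method takes today requires walking the
  // whole condition tree -- the extra complexity mentioned above.
  def toPredicate(cond: Cond): Set[TreePattern] => Boolean =
    bits => eval(cond, bits)

  private def eval(cond: Cond, bits: Set[TreePattern]): Boolean = cond match {
    case AnyOf(ps) => ps.exists(bits.contains)
    case AllOf(ps) => ps.forall(bits.contains)
    case And(l, r) => eval(l, bits) && eval(r, bits)
    case Or(l, r)  => eval(l, bits) || eval(r, bits)
    case Not(c)    => !eval(c, bits)
  }
}

// Example: match nodes whose pattern bits contain FILTER and either JOIN
// or AGGREGATE.
//   import TreePattern._, PatternConds._
//   val cond = allOf(FILTER) && (anyOf(JOIN) || anyOf(AGGREGATE))
//   toPredicate(cond)(Set(FILTER, JOIN))  // true
```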
   
   Anyway, thanks for the suggestion. I'll think about whether there's a 
simpler approach and may address it in subsequent PRs. 
   

