Github user marmbrus commented on the pull request:
https://github.com/apache/spark/pull/1055#issuecomment-45820167
Thanks for working on this :)
> We should decide whether or not a non-boolean condition is allowed in a
> branch of CaseWhen. Hive throws a SemanticException for this situation and I
> think it'd be good to mimic it -- the question is where in the whole Spark SQL
> pipeline should we signal an exception for such a query.
Yeah, that is invalid and we should throw an error (here and in many other
places). @liancheng is going to work on a framework for reporting analysis
errors eventually, but I don't think that is super high priority at the moment.
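For reference, here is a minimal, self-contained sketch of the kind of check being discussed (this is not Spark's actual code; the `CaseWhen`/`Expression` types below are simplified stand-ins): every WHEN condition must be of boolean type, otherwise we raise an error, mirroring Hive's SemanticException.

```scala
// Simplified stand-ins for Catalyst's type and expression hierarchy.
sealed trait DataType
case object BooleanType extends DataType
case object IntegerType extends DataType

case class Expression(name: String, dataType: DataType)

// A CASE WHEN expression: a sequence of (condition, value) branches
// plus an optional ELSE value.
case class CaseWhen(branches: Seq[(Expression, Expression)],
                    elseValue: Option[Expression]) {
  // Reject the expression if any WHEN condition is non-boolean,
  // analogous to Hive's SemanticException for the same situation.
  def checkConditions(): Unit = {
    branches.map(_._1).find(_.dataType != BooleanType).foreach { cond =>
      throw new IllegalArgumentException(
        s"WHEN condition '${cond.name}' must be of boolean type, got ${cond.dataType}")
    }
  }
}

object CaseWhenCheckDemo extends App {
  val ok = CaseWhen(
    Seq((Expression("a > 1", BooleanType), Expression("x", IntegerType))), None)
  ok.checkConditions()   // passes

  val bad = CaseWhen(
    Seq((Expression("a + 1", IntegerType), Expression("x", IntegerType))), None)
  bad.checkConditions()  // throws: non-boolean condition
}
```

Where exactly this check lives (parser, analyzer, or a dedicated error-reporting framework) is the open question above.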