Github user dilipbiswal commented on a diff in the pull request:
https://github.com/apache/spark/pull/21403#discussion_r206254993
--- Diff:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala
---
@@ -1422,11 +1422,26 @@ class Analyzer(
resolveSubQuery(s, plans)(ScalarSubquery(_, _, exprId))
case e @ Exists(sub, _, exprId) if !sub.resolved =>
resolveSubQuery(e, plans)(Exists(_, _, exprId))
- case In(value, Seq(l @ ListQuery(sub, _, exprId, _))) if value.resolved && !l.resolved =>
+ case In(values, Seq(l @ ListQuery(_, _, exprId, _)))
+ if values.forall(_.resolved) && !l.resolved =>
val expr = resolveSubQuery(l, plans)((plan, exprs) => {
ListQuery(plan, exprs, exprId, plan.output)
})
- In(value, Seq(expr))
+ val subqueryOutput = expr.plan.output
+ val resolvedIn = In(values, Seq(expr))
+ if (values.length != subqueryOutput.length) {
+ throw new AnalysisException(
--- End diff --
@mgaido91 We have this check in both checkInputDataTypes and here? Is there a
way we can keep the number-of-inputs check in one place?
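For illustration only, one way to avoid duplicating the arity check would be a single shared helper that both the subquery resolver and checkInputDataTypes delegate to. This is a hypothetical sketch, not Spark's actual code; the names AnalysisException (redefined locally here), InSubqueryChecks, and checkArity are illustrative assumptions:

```scala
// Hypothetical sketch: a local stand-in for Spark's AnalysisException.
case class AnalysisException(message: String) extends Exception(message)

object InSubqueryChecks {
  // Fails when the number of values on the left-hand side of IN does not
  // match the number of columns produced by the subquery. Both the
  // resolution rule and checkInputDataTypes could call this one helper
  // instead of each carrying its own copy of the check.
  def checkArity(valueCount: Int, subqueryOutputCount: Int): Unit = {
    if (valueCount != subqueryOutputCount) {
      throw AnalysisException(
        s"The number of columns in the left hand side of an IN subquery ($valueCount) " +
        s"does not match the number of columns in the output of the subquery " +
        s"($subqueryOutputCount).")
    }
  }
}
```

With this shape, the error message and the condition live in exactly one place, and callers only differ in where they obtain the two counts.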
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]