Github user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/22075#discussion_r209426401
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/higherOrderFunctions.scala ---
@@ -90,18 +90,25 @@ object LambdaFunction {
*/
trait HigherOrderFunction extends Expression {
- override def children: Seq[Expression] = inputs ++ functions
+ override def children: Seq[Expression] = arguments ++ functions
/**
- * Inputs to the higher ordered function.
+ * Arguments of the higher ordered function.
*/
- def inputs: Seq[Expression]
+ def arguments: Seq[Expression]
/**
- * All inputs have been resolved. This means that the types and nullabilty of (most of) the
+ * All arguments have been resolved. This means that the types and nullabilty of (most of) the
* lambda function arguments is known, and that we can start binding the lambda functions.
*/
- lazy val inputResolved: Boolean = inputs.forall(_.resolved)
+ lazy val argumentsResolved: Boolean = arguments.forall(_.resolved)
+
+ /**
+ * Checks the argument data types, returns `TypeCheckResult.success` if it's valid,
+ * or returns a `TypeCheckResult` with an error message if invalid.
+ * Note: it's not valid to call this method until `argumentsResolved == true`.
+ */
+ def checkArgumentDataTypes(): TypeCheckResult
--- End diff ---
How about:
```scala
def argumentTypes: Seq[AbstractDataType]

lazy val argumentsResolved: Boolean = arguments.forall(_.resolved) &&
  arguments.zip(argumentTypes).forall {
    case (arg, dt) => dt.acceptsType(arg.dataType)
  }
```
`ExpectsInputTypes` can stay unchanged, and we can still use it to report type mismatches.
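For illustration, here is a minimal, self-contained sketch of that shape. The `Expression`, `DataType`, and `AbstractDataType` definitions below are simplified stand-ins, not Spark's actual catalyst classes, and `SizeOfArray` is a hypothetical function added only to show how a concrete expression would plug in:

```scala
// Simplified placeholder types standing in for Spark's catalyst classes.
sealed trait DataType
case object IntegerType extends DataType
case class ArrayType(elementType: DataType) extends DataType

// Plays the role of AbstractDataType: an expected type that may accept
// one or several concrete types.
trait AbstractDataType {
  def acceptsType(other: DataType): Boolean
}
case object AnyArrayType extends AbstractDataType {
  def acceptsType(other: DataType): Boolean = other.isInstanceOf[ArrayType]
}
case class ConcreteType(dt: DataType) extends AbstractDataType {
  def acceptsType(other: DataType): Boolean = other == dt
}

trait Expression {
  def resolved: Boolean
  def dataType: DataType
}

trait HigherOrderFunction extends Expression {
  def arguments: Seq[Expression]

  // Expected type of each argument, as in the suggestion above.
  def argumentTypes: Seq[AbstractDataType]

  // Arguments count as resolved only when every argument is itself resolved
  // and its concrete type is accepted by the corresponding expected type.
  lazy val argumentsResolved: Boolean =
    arguments.forall(_.resolved) &&
      arguments.zip(argumentTypes).forall {
        case (arg, expected) => expected.acceptsType(arg.dataType)
      }
}

// Hypothetical concrete function: takes a single array-typed argument.
case class SizeOfArray(arg: Expression) extends HigherOrderFunction {
  def arguments: Seq[Expression] = Seq(arg)
  def argumentTypes: Seq[AbstractDataType] = Seq(AnyArrayType)
  def resolved: Boolean = argumentsResolved
  def dataType: DataType = IntegerType
}
```

With this shape, an ill-typed call simply never resolves, and the unchanged `ExpectsInputTypes` machinery remains the place that reports the type mismatch to the user.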
---