cloud-fan commented on a change in pull request #32764:
URL: https://github.com/apache/spark/pull/32764#discussion_r646309055
##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala
##########
@@ -2169,12 +2169,29 @@ class Analyzer(override val catalogManager: CatalogManager)
             unbound, arguments, unsupported)
         }
+        if (bound.inputTypes().length != arguments.length) {
+          throw QueryCompilationErrors.v2FunctionInvalidInputTypeLengthError(
+            bound, arguments)
+        }
+
+        val castedArguments = arguments.zip(bound.inputTypes()).map { case (arg, ty) =>
+          if (arg.dataType != ty) {
+            if (Cast.canCast(arg.dataType, ty)) {
Review comment:
Doing type coercion manually like this is a bit hacky; it bypasses the existing type coercion framework. For example, how would you support ANSI type coercion here? See `AnsiTypeCoercion` for more details.

My proposal is to let `Invoke`/`StaticInvoke` extend `ImplicitCastInputTypes` and leverage the existing type coercion framework.
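
The idea behind that proposal can be illustrated with a simplified, self-contained sketch (these are stand-in types, not Spark's actual Catalyst classes): instead of each expression coercing its own arguments, an expression merely *declares* its expected input types, and one generic analyzer rule inserts the casts for every such expression.

```scala
// Stand-in data types and expressions (hypothetical, for illustration only).
sealed trait DataType
case object IntType extends DataType
case object LongType extends DataType
case object StringType extends DataType

sealed trait Expr { def dataType: DataType }
case class Literal(value: Any, dataType: DataType) extends Expr
case class Cast(child: Expr, dataType: DataType) extends Expr

// Analogue of ExpectsInputTypes / ImplicitCastInputTypes: the expression
// only declares what it expects; it does no coercion itself.
trait ExpectsInputTypes { def inputTypes: Seq[DataType] }

case class Invoke(args: Seq[Expr], inputTypes: Seq[DataType], dataType: DataType)
  extends Expr with ExpectsInputTypes

// Whether an implicit cast is allowed; a stand-in for Cast.canCast.
// Swapping this predicate out is how an ANSI coercion mode would plug in.
def canCast(from: DataType, to: DataType): Boolean = (from, to) match {
  case (a, b) if a == b => true
  case (IntType, LongType) => true
  case _ => false
}

// The single generic coercion rule: wraps any mismatched argument in a Cast.
def implicitCastInputs(e: Expr): Expr = e match {
  case inv @ Invoke(args, types, _) =>
    val coerced = args.zip(types).map {
      case (arg, t) if arg.dataType == t => arg
      case (arg, t) if canCast(arg.dataType, t) => Cast(arg, t)
      case (arg, t) =>
        throw new IllegalArgumentException(s"cannot cast ${arg.dataType} to $t")
    }
    inv.copy(args = coerced)
  case other => other
}

val invoke = Invoke(Seq(Literal(1, IntType)), Seq(LongType), LongType)
// The Int literal gets wrapped in Cast(_, LongType) by the generic rule.
val resolved = implicitCastInputs(invoke)
```

The benefit is exactly what the comment argues: the per-expression code in the diff disappears, and alternative coercion policies (such as an ANSI mode) become a property of the shared rule rather than of each call site.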
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]