cloud-fan commented on a change in pull request #27937:
URL: https://github.com/apache/spark/pull/27937#discussion_r422778641
##########
File path:
sql/core/src/main/scala/org/apache/spark/sql/expressions/UserDefinedFunction.scala
##########
@@ -93,7 +93,7 @@ sealed abstract class UserDefinedFunction {
private[spark] case class SparkUserDefinedFunction(
f: AnyRef,
dataType: DataType,
- inputSchemas: Seq[Option[ScalaReflection.Schema]],
Review comment:
`inputSchemas` is composable, but it carries too little information;
that's why the Spark UDF was so limited before.
Our goal is to make the Spark UDF powerful enough that people don't need to
use internal APIs to build in-house UDFs. But you are right that the support
is not complete. @Ngone51 Can you take a closer look and see how to make it
complete?
BTW, if you do need to keep your in-house UDFs for a while, there is a way to
create an `ExpressionEncoder` from a `Seq[DataType]`: build a schema from the
types and call `RowEncoder.apply`. It only supports standard Spark external
types, i.e. `Row`, not case classes, which is the same behavior as older
versions of Spark.
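To illustrate, a minimal sketch of the workaround described above — the field names (`_0`, `_1`) and the input types are hypothetical, chosen only for the example; `RowEncoder.apply` takes a `StructType`, so the `Seq[DataType]` is first wrapped into one:

```scala
import org.apache.spark.sql.Row
import org.apache.spark.sql.catalyst.encoders.{ExpressionEncoder, RowEncoder}
import org.apache.spark.sql.types._

// Hypothetical input types for an in-house UDF.
val inputTypes: Seq[DataType] = Seq(IntegerType, StringType)

// Wrap the DataTypes into a StructType so RowEncoder can consume them.
// The field names here are placeholders.
val schema: StructType = StructType(inputTypes.zipWithIndex.map {
  case (dt, i) => StructField(s"_$i", dt, nullable = true)
})

// RowEncoder yields an ExpressionEncoder[Row]; it only handles Spark's
// standard external types (Row), not case classes.
val encoder: ExpressionEncoder[Row] = RowEncoder(schema)
```

Note this requires the `spark-sql` and `spark-catalyst` artifacts on the classpath, and `RowEncoder` lives in a `catalyst` (internal) package, so treat it as a stopgap rather than a stable API.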