amaliujia commented on code in PR #38270:
URL: https://github.com/apache/spark/pull/38270#discussion_r996416342
##########
connector/connect/src/main/scala/org/apache/spark/sql/connect/planner/SparkConnectPlanner.scala:
##########
@@ -193,20 +189,10 @@ class SparkConnectPlanner(plan: proto.Relation, session: SparkSession) {
    */
   private def transformScalarFunction(fun: proto.Expression.UnresolvedFunction): Expression = {
     val funName = fun.getPartsList.asScala.mkString(".")
-    funName match {
-      case "gt" =>
-        assert(fun.getArgumentsCount == 2, "`gt` function must have two arguments.")
-        expressions.GreaterThan(
-          transformExpression(fun.getArguments(0)),
-          transformExpression(fun.getArguments(1)))
-      case "eq" =>
-        assert(fun.getArgumentsCount == 2, "`eq` function must have two arguments.")
-        expressions.EqualTo(
-          transformExpression(fun.getArguments(0)),
-          transformExpression(fun.getArguments(1)))
-      case _ =>
-        lookupFunction(funName, fun.getArgumentsList.asScala.map(transformExpression).toSeq)
-    }
+    UnresolvedFunction(
+      Seq(funName),
Review Comment:
Testing a negative case is great.
I was also thinking of adding a test here:
```
val funName = fun.getPartsList.asScala.mkString(".")
UnresolvedFunction(
  Seq(funName),
  ...
```
This first concatenates the function name parts with `.` and then wraps the result in a `Seq`, which always produces a one-element `Seq`. Why not skip the concatenation and pass the parts through directly, or is the concatenation actually required?
A test case here would better demonstrate the intention of this implementation.
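To make the question concrete, here is a minimal standalone sketch (no Spark dependency; the two-part name `"my_db.my_func"` is a hypothetical example) showing the difference between joining the parts first and passing them through unchanged:
```scala
object NamePartsDemo {
  def main(args: Array[String]): Unit = {
    // Hypothetical multi-part function name, e.g. a qualified function.
    val parts = Seq("my_db", "my_func")

    // What the PR does: join the parts with "." and wrap the result in a Seq,
    // which always yields a one-element Seq.
    val joined: Seq[String] = Seq(parts.mkString("."))

    // The alternative the question raises: pass the parts through directly.
    val direct: Seq[String] = parts

    assert(joined == Seq("my_db.my_func"))
    assert(direct == Seq("my_db", "my_func"))
    println(s"joined = $joined, direct = $direct")
  }
}
```
A test along these lines would document whether downstream resolution expects a single dotted name or a multi-part name.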
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]