zhengruifeng commented on PR #38956:
URL: https://github.com/apache/spark/pull/38956#issuecomment-1340714677
It keeps failing in the invocation of the constructor. I printed the constructor as well as the expressions in the error message, and they look fine:
```
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
status = StatusCode.UNKNOWN
details = "[FAILED_FUNCTION_CALL] Failed preparing of the function `when_1, constructor:public org.apache.spark.sql.catalyst.expressions.CaseWhen(scala.collection.Seq), exprs:class org.apache.spark.sql.catalyst.expressions.EqualTo:(cast(a#6 as bigint) = 0) , class org.apache.spark.sql.catalyst.expressions.Literal:1.0 , class org.apache.spark.sql.catalyst.expressions.Literal:2.0, error:null` for call. Please, double check function's arguments."
debug_error_string = "{"created":"@1670408082.289473000","description":"Error received from peer ipv6:[::1]:15002","file":"src/core/lib/surface/call.cc","file_line":1064,"grpc_message":"[FAILED_FUNCTION_CALL] Failed preparing of the function `when_1, constructor:public org.apache.spark.sql.catalyst.expressions.CaseWhen(scala.collection.Seq), exprs:class org.apache.spark.sql.catalyst.expressions.EqualTo:(cast(a#6 as bigint) = 0) , class org.apache.spark.sql.catalyst.expressions.Literal:1.0 , class org.apache.spark.sql.catalyst.expressions.Literal:2.0, error:null` for call. Please, double check function's arguments.","grpc_status":2}"
```
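For reference (this is not code from the PR, and the helper name `buildViaSeqConstructor` is hypothetical), here is a minimal sketch of the reflective lookup-and-invoke path the error message describes. One thing worth noting about the `error:null` part: if the constructor body itself throws, reflection wraps it in an `InvocationTargetException` whose `getMessage` is usually null, so logging only the message would print exactly `error:null`; the underlying reason is only available via `getCause`.
```
import java.lang.reflect.InvocationTargetException
import org.apache.spark.sql.catalyst.expressions.Expression

// Hypothetical helper, NOT the actual Connect planner code: resolve the
// constructor that takes a single scala.collection.Seq and invoke it with
// the whole Seq passed as one reflective argument.
def buildViaSeqConstructor(cls: Class[_ <: Expression], args: Seq[Expression]): Expression = {
  val ctor = cls.getConstructors
    .find(_.getParameterTypes.toSeq == Seq(classOf[scala.collection.Seq[_]]))
    .getOrElse(throw new IllegalArgumentException(s"${cls.getName} has no Seq constructor"))
  try {
    ctor.newInstance(args).asInstanceOf[Expression]
  } catch {
    // If the constructor body throws, reflection wraps it in an
    // InvocationTargetException whose getMessage is typically null; the
    // real reason is only on getCause, so surface it explicitly.
    case e: InvocationTargetException =>
      throw new RuntimeException(s"constructor of ${cls.getName} failed: ${e.getCause}", e.getCause)
  }
}
```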
I also checked this in the shell:
```
scala> import org.apache.spark.sql.catalyst.expressions._
import org.apache.spark.sql.catalyst.expressions._

scala> val casewhen = CaseWhen(Seq.empty, None)
casewhen: org.apache.spark.sql.catalyst.expressions.CaseWhen = CASE END

scala> val varargCtor = casewhen.getClass.getConstructors().find(_.getParameterTypes.toSeq == Seq(classOf[Seq[_]]))
varargCtor: Option[java.lang.reflect.Constructor[_]] = Some(public org.apache.spark.sql.catalyst.expressions.CaseWhen(scala.collection.Seq))

scala> val expressions = Seq((col("a").cast("long") === 0).expr, lit(1.0).expr, lit(2.0).expr)
expressions: Seq[org.apache.spark.sql.catalyst.expressions.Expression] = List((cast('a as bigint) = 0), 1.0, 2.0)

scala> varargCtor.get.newInstance(expressions)
res10: Any = CASE WHEN (cast('a as bigint) = 0) THEN 1.0 ELSE 2.0 END
```