LuciferYang commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1266148725


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/util/RemoteSparkSession.scala:
##########
@@ -139,7 +139,18 @@ object SparkConnectServerUtils {
       .map(clientTestJar => Seq("--jars", clientTestJar.getCanonicalPath))
       .getOrElse(Seq.empty)
 
-    writerV2Configs ++ hiveTestConfigs ++ udfTestConfigs

Review Comment:
   run the following commands:
   ```
   build/sbt clean
   build/sbt "connect-client-jvm/test" -Phive
   ```
   
   there is one failing test:
   
   ```
   [info] - call_function *** FAILED *** (150 milliseconds)
   [info]   org.apache.spark.SparkException: [CANNOT_LOAD_FUNCTION_CLASS] Cannot load class test.org.apache.spark.sql.MyDoubleSum when registering the function `spark_catalog`.`default`.`custom_sum`, please make sure it is on the classpath.
   [info]   at org.apache.spark.sql.connect.client.GrpcExceptionConverter$.toSparkThrowable(GrpcExceptionConverter.scala:53)
   [info]   at org.apache.spark.sql.connect.client.GrpcExceptionConverter$.convert(GrpcExceptionConverter.scala:30)
   [info]   at org.apache.spark.sql.connect.client.GrpcExceptionConverter$$anon$1.hasNext(GrpcExceptionConverter.scala:38)
   [info]   at org.apache.spark.sql.connect.client.SparkResult.org$apache$spark$sql$connect$client$SparkResult$$processResponses(SparkResult.scala:80)
   [info]   at org.apache.spark.sql.connect.client.SparkResult.length(SparkResult.scala:133)
   [info]   at org.apache.spark.sql.connect.client.SparkResult.toArray(SparkResult.scala:150)
   [info]   at org.apache.spark.sql.Dataset.$anonfun$collect$1(Dataset.scala:2813)
   [info]   at org.apache.spark.sql.Dataset.withResult(Dataset.scala:3252)
   [info]   at org.apache.spark.sql.Dataset.collect(Dataset.scala:2812)
   [info]   at org.apache.spark.sql.ClientE2ETestSuite.$anonfun$new$139(ClientE2ETestSuite.scala:1175)
   [info]   at org.apache.spark.sql.connect.client.util.RemoteSparkSession.$anonfun$test$1(RemoteSparkSession.scala:246)
   ```
   
   @beliefer we should add `(LocalProject("sql") / Test / Keys.`package`).value` to
   
   https://github.com/apache/spark/blob/228b5dbfd7688a8efa7135d9ec7b00b71e41a38a/project/SparkBuild.scala#L875-L878
   
   so that the `sql` test jar is built and packaged before the tests run.
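   As a rough sketch (not the actual contents of `SparkBuild.scala`; the task name and surrounding structure here are illustrative, only the quoted setting is the proposed addition), the change would look something like:
   ```scala
   // Hypothetical sbt fragment: make the test-setup task also package
   // the sql module's test jar, so it exists on disk before the
   // connect client tests try to load classes from it.
   val buildTestDeps = TaskKey[Unit](
     "buildTestDeps",
     "Build artifacts that the connect client tests depend on")
   
   buildTestDeps := {
     // ... existing packaging dependencies ...
     (LocalProject("sql") / Test / Keys.`package`).value  // proposed addition
   }
   ```
   Because sbt evaluates `.value` dependencies before the task body runs, listing the `sql / Test` package task here forces the test jar to be built first.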
   
   For Maven, let me do some more checking.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

