LuciferYang commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1267584892


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/util/RemoteSparkSession.scala:
##########
@@ -139,7 +139,18 @@ object SparkConnectServerUtils {
       .map(clientTestJar => Seq("--jars", clientTestJar.getCanonicalPath))
       .getOrElse(Seq.empty)
 
-    writerV2Configs ++ hiveTestConfigs ++ udfTestConfigs

Review Comment:
   I think this is unrelated to this PR, and I think the way `--jars` 
is currently used in the code is incorrect.
   
   When submitting the args as
   
   ```
   --jars spark-catalyst-xx.jar
   --jars spark-connect-client-jvm-xx.jar
   --jars spark-sql-xx.jar
   ```
   
   the final effective argument will be `--jars spark-sql-xx.jar`, because 
later `--jars` options overwrite earlier ones instead of accumulating. If we 
enable debug logging, we will find that only the "Added JAR" logs for 
`spark-sql_2.12-3.5.0-SNAPSHOT-tests.jar` and 
`spark-connect_2.12-3.5.0-SNAPSHOT.jar` are present.
   
   ```
   23/07/19 14:00:34 INFO SparkContext: Added JAR 
file:///Users/yangjie01/SourceCode/git/spark-mine-12/sql/core/target/spark-sql_2.12-3.5.0-SNAPSHOT-tests.jar
 at spark://localhost:56841/jars/spark-sql_2.12-3.5.0-SNAPSHOT-tests.jar with 
timestamp 1689746434318
   23/07/19 14:00:34 INFO SparkContext: Added JAR 
file:/Users/yangjie01/SourceCode/git/spark-mine-12/connector/connect/server/target/spark-connect_2.12-3.5.0-SNAPSHOT.jar
 at spark://localhost:56841/jars/spark-connect_2.12-3.5.0-SNAPSHOT.jar with 
timestamp 1689746434318
   ```
   
   and the `spark.jars` configuration item also includes only these two jars.
   
   ```
   
Array((spark.app.name,org.apache.spark.sql.connect.SimpleSparkConnectService), 
(spark.jars,file:///Users/yangjie01/SourceCode/git/spark-mine-12/sql/core/target/spark-sql_2.12-3.5.0-SNAPSHOT-tests.jar,file:/Users/yangjie01/SourceCode/git/spark-mine-12/connector/connect/server/target/spark-connect_2.12-3.5.0-SNAPSHOT.jar),
 ...
   ```
   
   We should correct the usage to a single comma-separated `--jars` option: 
`--jars spark-catalyst-xx.jar,spark-connect-client-jvm-xx.jar,spark-sql-xx.jar`; 
then the Maven tests should pass.
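   The fix described above can be sketched as a small helper that collects 
all jar paths first and emits one `--jars` option. This is only a minimal 
sketch, not the actual `RemoteSparkSession.scala` code; the helper name 
`mergeJarArgs` is hypothetical.

   ```scala
   // Hypothetical helper: spark-submit keeps only the last occurrence of a
   // repeated --jars flag, so join all jar paths into a single
   // comma-separated value instead of emitting --jars once per jar.
   def mergeJarArgs(jarPaths: Seq[String]): Seq[String] =
     if (jarPaths.isEmpty) Seq.empty
     else Seq("--jars", jarPaths.mkString(","))
   ```

   For example, `mergeJarArgs(Seq("a.jar", "b.jar"))` produces 
`Seq("--jars", "a.jar,b.jar")` rather than two separate `--jars` options.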
   
   I think we can merge this PR first and fix this issue separately. But 
@beliefer, if you prefer, you can also address it in this one :)
   



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

