This is an automated email from the ASF dual-hosted git repository.

hvanhovell pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 104a546f6d9 [SPARK-42334][CONNECT][BUILD] Make sure connect client assembly and sql package is built before running client tests - SBT
104a546f6d9 is described below

commit 104a546f6d9e1d2d7c63ced9cbbdb588110252e9
Author: yangjie01 <yangji...@baidu.com>
AuthorDate: Sat Feb 4 21:42:38 2023 -0400

    [SPARK-42334][CONNECT][BUILD] Make sure connect client assembly and sql package is built before running client tests - SBT
    
    ### What changes were proposed in this pull request?
    `build/sbt clean "connect-client-jvm/test"` fails after SPARK-42172 was merged, so this PR makes sure sbt assembles the connect client assembly jar and packages the sql jar before the `CompatibilitySuite` of the client module is run.
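    The wiring pattern this PR uses is a standard sbt idiom: declare an aggregate task whose body references the `.value` of each upstream packaging task (which makes them dependencies), then gate `test` on it with `dependsOn`. A minimal build-definition sketch of that pattern, with illustrative project names rather than Spark's full build graph:

    ```scala
    // Sketch of the pattern (sbt build DSL): an aggregate task that forces
    // upstream artifacts to exist, wired in front of `test`.
    val buildTestDeps = TaskKey[Unit]("buildTestDeps", "Build needed dependencies for test.")

    lazy val clientSettings = Seq(
      buildTestDeps := {
        // Calling `.value` on another task inside a task body registers it as
        // a dependency; sbt runs all of these before buildTestDeps completes.
        (LocalProject("sql") / Compile / Keys.`package`).value
        (LocalProject("connect") / assembly).value
      },

      // `dependsOn` guarantees the jars above are built before any test runs.
      test := ((Test / test) dependsOn buildTestDeps).value
    )
    ```

    Note that `.value` dependencies run in parallel where possible, so this only guarantees the jars exist before `test` starts, not any ordering among the packaging tasks themselves.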
    
    ### Why are the changes needed?
    Similar to SPARK-42284, this makes it easier to develop and test the JVM client for Spark Connect.
    
    ### Does this PR introduce _any_ user-facing change?
    No
    
    ### How was this patch tested?
    Manually tested.
    
    ```
    build/sbt clean "connect-client-jvm/test"
    ```
    
    **Before**
    
    ```
    [info] - compatibility MiMa tests *** FAILED *** (34 milliseconds)
    [info]   java.lang.AssertionError: assertion failed: Failed to find the jar inside folder: /Users/yangjie01/SourceCode/git/spark-mine-sbt/connector/connect/client/jvm/target
    [info]   at scala.Predef$.assert(Predef.scala:223)
    [info]   at org.apache.spark.sql.connect.client.util.IntegrationTestUtils$.findJar(IntegrationTestUtils.scala:67)
    [info]   at org.apache.spark.sql.connect.client.CompatibilitySuite.clientJar$lzycompute(CompatibilitySuite.scala:57)
    [info]   at org.apache.spark.sql.connect.client.CompatibilitySuite.clientJar(CompatibilitySuite.scala:53)
    [info]   at org.apache.spark.sql.connect.client.CompatibilitySuite.$anonfun$new$1(CompatibilitySuite.scala:69)
    ....
    [info] *** 2 TESTS FAILED ***
    [error] Failed tests:
    [error]         org.apache.spark.sql.connect.client.CompatibilitySuite
    [error] (connect-client-jvm / Test / test) sbt.TestsFailedException: Tests unsuccessful
    [error] Total time: 196 s (03:16), completed 2023-2-3 17:20:40
    ```
    
    **After**
    
    ```
    [info] Run completed in 20 seconds, 652 milliseconds.
    [info] Total number of tests run: 31
    [info] Suites: completed 6, aborted 0
    [info] Tests: succeeded 31, failed 0, canceled 0, ignored 0, pending 0
    [info] All tests passed.
    [success] Total time: 230 s (03:50), completed 2023-2-3 17:35:37
    ```
    
    Closes #39874 from LuciferYang/make-test.
    
    Authored-by: yangjie01 <yangji...@baidu.com>
    Signed-off-by: Herman van Hovell <her...@databricks.com>
---
 project/SparkBuild.scala | 9 ++++++++-
 1 file changed, 8 insertions(+), 1 deletion(-)

diff --git a/project/SparkBuild.scala b/project/SparkBuild.scala
index 4eb17e88d4d..a4c8d62dd6e 100644
--- a/project/SparkBuild.scala
+++ b/project/SparkBuild.scala
@@ -829,6 +829,7 @@ object SparkConnect {
 
 object SparkConnectClient {
   import BuildCommons.protoVersion
+  val buildTestDeps = TaskKey[Unit]("buildTestDeps", "Build needed dependencies for test.")
 
   lazy val settings = Seq(
     // For some reason the resolution from the imported Maven build does not work for some
@@ -851,8 +852,14 @@ object SparkConnectClient {
       )
     },
 
+    buildTestDeps := {
+      (LocalProject("sql") / Compile / Keys.`package`).value
+      (LocalProject("connect") / assembly).value
+      (LocalProject("connect-client-jvm") / assembly).value
+    },
+
     // Make sure the connect server assembly jar is available for testing.
-    test := ((Test / test) dependsOn (LocalProject("connect") / assembly)).value,
+    test := ((Test / test) dependsOn (buildTestDeps)).value,
 
     (assembly / test) := { },
 

