This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new c4a7588cbd5 [SPARK-46053][TESTS] Support `DEDICATED_JVM_SBT_TESTS`
c4a7588cbd5 is described below

commit c4a7588cbd5febc50054253da679198e741025a6
Author: Dongjoon Hyun <dh...@apple.com>
AuthorDate: Wed Nov 22 01:34:37 2023 -0800

    [SPARK-46053][TESTS] Support `DEDICATED_JVM_SBT_TESTS`
    
    ### What changes were proposed in this pull request?
    
    This PR aims to support the `DEDICATED_JVM_SBT_TESTS` environment variable, which accepts a comma-separated list of test classes to run in dedicated JVMs during SBT Scala tests.
    
    ### Why are the changes needed?
    
    - The `SERIAL_SBT_TESTS` environment variable allows users to run SBT tests serially.
    - By default, SBT tests run in parallel in two default groups, `default_test_group` and `hive_execution_test_group`, with the following static list of exceptions:

    https://github.com/apache/spark/blob/06635e25f170e61f6cfe53232d001993ec7d376d/project/SparkBuild.scala#L522-L547
    
    After this PR, users can add their own classes on demand without touching `SparkBuild.scala`.
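    
    For illustration (the suite name is hypothetical), a user could run something like `DEDICATED_JVM_SBT_TESTS=org.apache.spark.ExampleDedicatedSuite build/sbt "core/test"` so that suite is assigned to its own test JVM group.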
    
    ### Does this PR introduce _any_ user-facing change?
    
    No.
    
    ### How was this patch tested?
    
    Manual review.
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #43957 from dongjoon-hyun/SPARK-46053.
    
    Authored-by: Dongjoon Hyun <dh...@apple.com>
    Signed-off-by: Dongjoon Hyun <dh...@apple.com>
---
 project/SparkBuild.scala | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/project/SparkBuild.scala b/project/SparkBuild.scala
index 5d2a9cfef98..e1db7b506c5 100644
--- a/project/SparkBuild.scala
+++ b/project/SparkBuild.scala
@@ -544,7 +544,7 @@ object SparkParallelTestGrouping {
     "org.apache.spark.sql.streaming.RocksDBStateStoreStreamingAggregationSuite",
     "org.apache.spark.shuffle.KubernetesLocalDiskShuffleDataIOSuite",
     "org.apache.spark.sql.hive.HiveScalaReflectionSuite"
-  )
+  ) ++ sys.env.get("DEDICATED_JVM_SBT_TESTS").map(_.split(",")).getOrElse(Array.empty).toSet
 
   private val DEFAULT_TEST_GROUP = "default_test_group"
   private val HIVE_EXECUTION_TEST_GROUP = "hive_execution_test_group"
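
For readers scanning the patch, here is a standalone sketch (not part of the change; the example value in the comments is hypothetical) of what the appended expression evaluates to:

object DedicatedJvmTestsSketch {
  def main(args: Array[String]): Unit = {
    // Mirror the logic appended to SparkBuild.scala: read the environment
    // variable, split it on commas, and fall back to an empty set when unset.
    val dedicated: Set[String] = sys.env.get("DEDICATED_JVM_SBT_TESTS")
      .map(_.split(","))
      .getOrElse(Array.empty[String])
      .toSet
    // With e.g. DEDICATED_JVM_SBT_TESTS=org.apache.spark.FooSuite,org.apache.spark.BarSuite
    // this prints: Set(org.apache.spark.FooSuite, org.apache.spark.BarSuite)
    println(dedicated)
  }
}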


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
