This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 2cb72322033d [SPARK-50184][SPARK-48997][SS][TESTS] Make test case `: maintenance threads with exceptions unload only themselves` use a self-cleaning `rootLocation`
2cb72322033d is described below

commit 2cb72322033dc9cd19fb7ed522cae95c7cb8252d
Author: yangjie01 <[email protected]>
AuthorDate: Fri Nov 1 15:53:03 2024 +0900

    [SPARK-50184][SPARK-48997][SS][TESTS] Make test case `: maintenance threads with exceptions unload only themselves` use a self-cleaning `rootLocation`
    
    ### What changes were proposed in this pull request?
    This PR makes the test case `SPARK-48997: maintenance threads with exceptions unload only themselves` use a self-cleaning `rootLocation`, so that no residual directories are left behind after test execution.
    
    ### Why are the changes needed?
    Avoid leaving residual directories after testing.
    
    ### Does this PR introduce _any_ user-facing change?
    No
    
    ### How was this patch tested?
    - Pass GitHub Actions
    - Manual Check:
    
    ```
    build/mvn clean install -DskipTests -pl sql/core -am
    build/mvn test -pl sql/core -Dtest=none -DwildcardSuites=org.apache.spark.sql.execution.streaming.state.StateStoreSuite
    ```
    
    **Before**
    
    We can observe a residual test data directory (because it contains only empty directories, `git status` cannot detect it):
    
    ```
    ls -R sql/core/spark-48997
    0
    
    sql/core/spark-48997/0:
    0   1   2
    
    sql/core/spark-48997/0/0:
    
    sql/core/spark-48997/0/1:
    
    sql/core/spark-48997/0/2:
    ```
    
    **After**
    
    Confirm that the root location `./sql/core/target/tmp/spark-aa7a0f70-ffc6-4853-a99c-1e798ec0398c/spark-48997` has been cleaned up.
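    The fix relies on building the store's `rootLocation` under a temporary directory that is removed after use, instead of the literal relative path `"spark-48997"`. Below is a minimal sketch of that self-cleaning pattern using plain `java.nio.file` rather than Spark's `Utils.createTempDir` (which additionally registers the directory for deletion on JVM shutdown); the helper name `withTempRootLocation` is hypothetical:
    
    ```scala
    import java.nio.file.{Files, Path, Paths}
    import scala.util.control.NonFatal
    
    // Hypothetical helper sketching the self-cleaning pattern from the fix.
    // It stands in for Spark's Utils.createTempDir, which also registers the
    // created directory for deletion on JVM shutdown.
    def withTempRootLocation[T](f: String => T): T = {
      val tempDir: Path = Files.createTempDirectory("spark-test-")
      try {
        // Mirrors s"${Utils.createTempDir().getAbsolutePath}/spark-48997"
        f(s"${tempDir.toAbsolutePath}/spark-48997")
      } finally {
        // Recursively delete the temp dir so no residue survives the test.
        def deleteRecursively(p: Path): Unit = {
          if (Files.isDirectory(p)) {
            val children = Files.list(p)
            try children.forEach(c => deleteRecursively(c)) finally children.close()
          }
          Files.deleteIfExists(p)
        }
        try deleteRecursively(tempDir) catch { case NonFatal(_) => () }
      }
    }
    ```
    
    Because every `StateStoreId` in the test is rooted under this directory, cleaning the single temp root removes all partition subdirectories (`0/0`, `0/1`, `0/2`) at once.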
    
    ### Was this patch authored or co-authored using generative AI tooling?
    No
    
    Closes #48717 from LuciferYang/SPARK-48997-test.
    
    Lead-authored-by: yangjie01 <[email protected]>
    Co-authored-by: YangJie <[email protected]>
    Signed-off-by: Hyukjin Kwon <[email protected]>
---
 .../spark/sql/execution/streaming/state/StateStoreSuite.scala      | 7 ++++---
 1 file changed, 4 insertions(+), 3 deletions(-)

diff --git a/sql/core/src/test/scala/org/apache/spark/sql/execution/streaming/state/StateStoreSuite.scala b/sql/core/src/test/scala/org/apache/spark/sql/execution/streaming/state/StateStoreSuite.scala
index 031f5a8b8764..47dd77f1bb9f 100644
--- a/sql/core/src/test/scala/org/apache/spark/sql/execution/streaming/state/StateStoreSuite.scala
+++ b/sql/core/src/test/scala/org/apache/spark/sql/execution/streaming/state/StateStoreSuite.scala
@@ -1627,13 +1627,14 @@ abstract class StateStoreSuiteBase[ProviderClass <: StateStoreProvider]
 
     withSpark(new SparkContext(conf)) { sc =>
       withCoordinatorRef(sc) { _ =>
+        val rootLocation = s"${Utils.createTempDir().getAbsolutePath}/spark-48997"
         // 0 and 1's maintenance will fail
         val provider0Id =
-          StateStoreProviderId(StateStoreId("spark-48997", 0, 0), UUID.randomUUID)
+          StateStoreProviderId(StateStoreId(rootLocation, 0, 0), UUID.randomUUID)
         val provider1Id =
-          StateStoreProviderId(StateStoreId("spark-48997", 0, 1), UUID.randomUUID)
+          StateStoreProviderId(StateStoreId(rootLocation, 0, 1), UUID.randomUUID)
         val provider2Id =
-          StateStoreProviderId(StateStoreId("spark-48997", 0, 2), UUID.randomUUID)
+          StateStoreProviderId(StateStoreId(rootLocation, 0, 2), UUID.randomUUID)
 
         // Create provider 2 first to start the maintenance task + pool
         StateStore.get(


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
