xkrogen opened a new pull request #30122:
URL: https://github.com/apache/spark/pull/30122


   
   
   ### What changes were proposed in this pull request?
   This PR changes `HiveExternalCatalogVersionsSuite` to store the Spark binaries it localizes in a standard temporary directory by default. It also adds a new system property, `spark.test.cache-dir`, which can be used to define a static location into which the Spark binaries will be localized, allowing them to be shared between test executions. When the system property is set, the downloaded binaries are not deleted after the test runs.
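   
   A minimal sketch of how the suite might resolve the directory, assuming `java.io.File` and `org.apache.spark.util.Utils` are imported as in the suite; the `useStaticCacheDir` flag and the exact structure are illustrative, not necessarily the code in this PR:
   ```
     // Hypothetical sketch: use the static cache dir when the system property
     // is set, otherwise fall back to a per-run temporary directory.
     private val useStaticCacheDir = sys.props.contains("spark.test.cache-dir")
     private val sparkTestingDir: File = sys.props.get("spark.test.cache-dir")
       .map(new File(_))
       .getOrElse(Utils.createTempDir(namePrefix = "test-spark"))
   ```
   The property would be passed as an ordinary JVM system property, e.g. `-Dspark.test.cache-dir=/path/to/cache`.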
   
   ### Why are the changes needed?
   In SPARK-22356 (PR #19579), the `sparkTestingDir` used by `HiveExternalCatalogVersionsSuite` was hard-coded to enable re-use of the downloaded Spark tarball between test executions:
   ```
     // For local test, you can set `sparkTestingDir` to a static value like `/tmp/test-spark`, to
     // avoid downloading Spark of different versions in each run.
     private val sparkTestingDir = new File("/tmp/test-spark")
   ```
   However, this doesn't achieve the intended re-use, because the directory is deleted after every run in `afterAll()`:
   ```
     override def afterAll(): Unit = {
       try {
         Utils.deleteRecursively(wareHousePath)
         Utils.deleteRecursively(tmpDataDir)
         Utils.deleteRecursively(sparkTestingDir)
       } finally {
         super.afterAll()
       }
     }
   ```
   
   Hard-coding a `/tmp` directory is also undesirable, since on some systems this is not the proper place to store temporary files; and because the directory is deleted after each run, the hard-coded path currently provides no caching benefit anyway.
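   
   With the new system property, the cleanup in `afterAll()` would presumably become conditional; a minimal sketch, reusing the hypothetical `useStaticCacheDir` flag from the sketch above:
   ```
     override def afterAll(): Unit = {
       try {
         Utils.deleteRecursively(wareHousePath)
         Utils.deleteRecursively(tmpDataDir)
         // Only delete the Spark binaries when they live in a per-run temp
         // dir; a user-specified cache dir is left in place for the next run.
         if (!useStaticCacheDir) {
           Utils.deleteRecursively(sparkTestingDir)
         }
       } finally {
         super.afterAll()
       }
     }
   ```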
   
   ### Does this PR introduce _any_ user-facing change?
   Developer-facing changes only, as this is in a test.
   
   ### How was this patch tested?
   The test continues to execute as expected.

