[
https://issues.apache.org/jira/browse/SPARK-33214?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Wenchen Fan resolved SPARK-33214.
---------------------------------
Fix Version/s: 3.1.0
Resolution: Fixed
Issue resolved by pull request 30122
[https://github.com/apache/spark/pull/30122]
> HiveExternalCatalogVersionsSuite shouldn't use or delete hard-coded /tmp
> directory
> -----------------------------------------------------------------------------------
>
> Key: SPARK-33214
> URL: https://issues.apache.org/jira/browse/SPARK-33214
> Project: Spark
> Issue Type: Bug
> Components: SQL, Tests
> Affects Versions: 3.0.1
> Reporter: Erik Krogen
> Assignee: Erik Krogen
> Priority: Major
> Fix For: 3.1.0
>
>
> In SPARK-22356, the {{sparkTestingDir}} used by
> {{HiveExternalCatalogVersionsSuite}} became hard-coded to enable re-use of
> the downloaded Spark tarball between test executions:
> {code}
> // For local test, you can set `sparkTestingDir` to a static value like `/tmp/test-spark`, to
> // avoid downloading Spark of different versions in each run.
> private val sparkTestingDir = new File("/tmp/test-spark")
> {code}
> However, this doesn't work, since the directory is deleted after every run:
> {code}
> override def afterAll(): Unit = {
>   try {
>     Utils.deleteRecursively(wareHousePath)
>     Utils.deleteRecursively(tmpDataDir)
>     Utils.deleteRecursively(sparkTestingDir)
>   } finally {
>     super.afterAll()
>   }
> }
> {code}
> Hard-coding a {{/tmp}} directory is also problematic in itself, as in some
> environments that is not the proper place to store temporary files. And since
> the directory is deleted in {{afterAll()}}, the hard-coded path currently
> provides no re-use benefit anyway.
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]