This is an automated email from the ASF dual-hosted git repository.
dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new 9bc273ee0dad [SPARK-47136][CORE][TESTS] Fix `MavenUtilsSuite` to use `MavenUtils.resolveMavenCoordinates` properly
9bc273ee0dad is described below
commit 9bc273ee0daddef3a0d453ba6311e996bc56830d
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Thu Feb 22 15:26:01 2024 -0800
[SPARK-47136][CORE][TESTS] Fix `MavenUtilsSuite` to use `MavenUtils.resolveMavenCoordinates` properly
### What changes were proposed in this pull request?
This PR aims to do the following:
1. Fix `MavenUtilsSuite` to use `MavenUtils.resolveMavenCoordinates` properly by using the `ivyPath` parameter of the `MavenUtils.loadIvySettings` method consistently.
2. Make all test cases isolated by adding `beforeEach` and `afterEach` instead of a single `beforeAll`.
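The isolation pattern in item 2 can be sketched without any test-framework dependency. This is a minimal illustration, not the suite's actual code: `withTempIvyPath` is a hypothetical helper standing in for the `beforeEach`/`afterEach` pair, and the temp-directory handling assumes only the JDK's `java.nio.file` API.

```scala
import java.io.File
import java.nio.file.{Files, Path}
import java.util.Comparator

// Minimal, dependency-free sketch of per-test isolation: create a fresh
// temp ivy directory before each test (beforeEach) and delete it afterwards
// (afterEach), so no state leaks between test cases.
object TempDirIsolation {
  def withTempIvyPath[A](testBody: String => A): A = {
    val dir: Path = Files.createTempDirectory("ivy") // beforeEach
    try testBody(dir.toAbsolutePath.toString)
    finally {
      // afterEach: delete recursively, children before parents
      Files.walk(dir).sorted(Comparator.reverseOrder[Path]())
        .forEach((p: Path) => Files.deleteIfExists(p))
    }
  }

  def main(args: Array[String]): Unit = {
    val first = withTempIvyPath(identity)
    val second = withTempIvyPath(identity)
    println(first != second)        // each test case sees a distinct directory
    println(new File(first).exists) // directory is cleaned up afterwards
  }
}
```

Because every test case gets its own directory and cleans it up, a failure or leftover artifact in one test can no longer affect another.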
### Why are the changes needed?
1. `MavenUtils` is expected to set the following settings together internally when it receives `ivyPath`.
https://github.com/apache/spark/blob/9debaeaa5a079a73605cddb90b1a77274c5284d3/common/utils/src/main/scala/org/apache/spark/util/MavenUtils.scala#L337-L342
2. `MavenUtilsSuite` uses `tempIvyPath` for all `MavenUtils.resolveMavenCoordinates` calls except one test case.
https://github.com/apache/spark/blob/9debaeaa5a079a73605cddb90b1a77274c5284d3/common/utils/src/test/scala/org/apache/spark/util/MavenUtilsSuite.scala#L175-L175
3. The following is the missed case, which this PR fixes.
https://github.com/apache/spark/blob/9debaeaa5a079a73605cddb90b1a77274c5284d3/common/utils/src/test/scala/org/apache/spark/util/MavenUtilsSuite.scala#L253
### Does this PR introduce _any_ user-facing change?
No.
### How was this patch tested?
Pass the CIs.
```
$ build/sbt "common-utils/testOnly *MavenUtilsSuite"
...
[info] MavenUtilsSuite:
[info] - incorrect maven coordinate throws error (9 milliseconds)
[info] - create repo resolvers (19 milliseconds)
[info] - create additional resolvers (7 milliseconds)
:: loading settings :: url = jar:file:/Users/dongjoon/Library/Caches/Coursier/v1/https/repo1.maven.org/maven2/org/apache/ivy/ivy/2.5.1/ivy-2.5.1.jar!/org/apache/ivy/core/settings/ivysettings.xml
[info] - add dependencies works correctly (29 milliseconds)
[info] - excludes works correctly (2 milliseconds)
[info] - ivy path works correctly (661 milliseconds)
[info] - search for artifact at local repositories (405 milliseconds)
[info] - dependency not found throws RuntimeException (198 milliseconds)
:: loading settings :: url = jar:file:/Users/dongjoon/Library/Caches/Coursier/v1/https/repo1.maven.org/maven2/org/apache/ivy/ivy/2.5.1/ivy-2.5.1.jar!/org/apache/ivy/core/settings/ivysettings.xml
[info] - neglects Spark and Spark's dependencies (388 milliseconds)
[info] - exclude dependencies end to end (385 milliseconds)
:: loading settings :: file = /Users/dongjoon/APACHE/spark-merge/target/tmp/ivy-9aa3863e-9dba-4002-996b-5e86b2f1281f/ivysettings.xml
[info] - load ivy settings file (103 milliseconds)
[info] - SPARK-10878: test resolution files cleaned after resolving artifact (70 milliseconds)
Spark was unable to load org/apache/spark/log4j2-defaults.properties
[info] - SPARK-34624: should ignore non-jar dependencies (247 milliseconds)
[info] Run completed in 3 seconds, 16 milliseconds.
[info] Total number of tests run: 13
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 13, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[success] Total time: 3 s, completed Feb 22, 2024, 2:21:18 PM
```
### Was this patch authored or co-authored using generative AI tooling?
No.
Closes #45220 from dongjoon-hyun/SPARK-47136.
Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
---
.../scala/org/apache/spark/util/MavenUtilsSuite.scala | 16 ++++++++++------
1 file changed, 10 insertions(+), 6 deletions(-)
diff --git a/common/utils/src/test/scala/org/apache/spark/util/MavenUtilsSuite.scala b/common/utils/src/test/scala/org/apache/spark/util/MavenUtilsSuite.scala
index 642eca3cf933..d30422ca8dd5 100644
--- a/common/utils/src/test/scala/org/apache/spark/util/MavenUtilsSuite.scala
+++ b/common/utils/src/test/scala/org/apache/spark/util/MavenUtilsSuite.scala
@@ -28,14 +28,14 @@ import scala.jdk.CollectionConverters._
import org.apache.ivy.core.module.descriptor.MDArtifact
import org.apache.ivy.core.settings.IvySettings
import org.apache.ivy.plugins.resolver.{AbstractResolver, ChainResolver, FileSystemResolver, IBiblioResolver}
-import org.scalatest.BeforeAndAfterAll
+import org.scalatest.BeforeAndAfterEach
import org.scalatest.funsuite.AnyFunSuite // scalastyle:ignore funsuite
import org.apache.spark.util.MavenUtils.MavenCoordinate
class MavenUtilsSuite
extends AnyFunSuite // scalastyle:ignore funsuite
- with BeforeAndAfterAll {
+ with BeforeAndAfterEach {
private var tempIvyPath: String = _
@@ -55,11 +55,16 @@ class MavenUtilsSuite
// scalastyle:on println
}
- override def beforeAll(): Unit = {
- super.beforeAll()
+ override def beforeEach(): Unit = {
+ super.beforeEach()
tempIvyPath = SparkFileUtils.createTempDir(namePrefix = "ivy").getAbsolutePath()
}
+ override def afterEach(): Unit = {
+ SparkFileUtils.deleteRecursively(new File(tempIvyPath))
+ super.afterEach()
+ }
+
test("incorrect maven coordinate throws error") {
val coordinates = Seq("a:b: ", " :a:b", "a: :b", "a:b:", ":a:b", "a::b", "::", "a:b", "a")
for (coordinate <- coordinates) {
@@ -250,8 +255,7 @@ class MavenUtilsSuite
val settingsFile = Paths.get(tempIvyPath, "ivysettings.xml")
Files.write(settingsFile, settingsText.getBytes(StandardCharsets.UTF_8))
- val settings = MavenUtils.loadIvySettings(settingsFile.toString, None, None)
- settings.setDefaultIvyUserDir(new File(tempIvyPath)) // NOTE - can't set this through file
+ val settings = MavenUtils.loadIvySettings(settingsFile.toString, None, Some(tempIvyPath))
val testUtilSettings = new IvySettings
testUtilSettings.setDefaultIvyUserDir(new File(tempIvyPath))