This is an automated email from the ASF dual-hosted git repository.

yangjie01 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 6fe6828b59ac [SPARK-46302][TESTS] Skip maven daily testing as ivy uses some corrupted cache jar files
6fe6828b59ac is described below

commit 6fe6828b59ac3e4cfe8b953945cc8c064cc5a133
Author: panbingkun <[email protected]>
AuthorDate: Wed Dec 20 13:50:48 2023 +0800

    [SPARK-46302][TESTS] Skip maven daily testing as ivy uses some corrupted cache jar files
    
    ### What changes were proposed in this pull request?
    This PR aims to skip the Maven daily tests because Ivy uses some corrupted cached jar files.
    This is a temporary workaround.
    After SPARK-46400 is fixed and applied to the already released Spark versions, we should remove this logic.
    
    ### Why are the changes needed?
    Fix the Maven daily testing GA job.
    In our Maven daily tests, some UTs failed due to corrupt jars in the Maven repo:
    
https://github.com/apache/spark/actions/runs/7019155617/job/19095991788#step:9:27263
    <img width="993" alt="image" src="https://github.com/apache/spark/assets/15246973/def1e393-d0af-47b4-86d8-8c53a57d1ae0">
    <img width="742" alt="image" src="https://github.com/apache/spark/assets/15246973/0955c447-20db-4157-8795-b279edfabd43">
    
    ### Does this PR introduce _any_ user-facing change?
    No.
    
    ### How was this patch tested?
    - Pass GA.
    - Manually check.
    <img width="986" alt="image" src="https://github.com/apache/spark/assets/15246973/c4ee3df3-f04e-438e-ac4d-6c6e50daf787">
    
    ### Was this patch authored or co-authored using generative AI tooling?
    No.
    
    Closes #44208 from panbingkun/try_fix_maven_test.
    
    Authored-by: panbingkun <[email protected]>
    Signed-off-by: yangjie01 <[email protected]>
---
 .github/workflows/build_maven.yml                                | 5 +++++
 .../apache/spark/sql/hive/HiveExternalCatalogVersionsSuite.scala | 9 ++++++++-
 2 files changed, 13 insertions(+), 1 deletion(-)

diff --git a/.github/workflows/build_maven.yml b/.github/workflows/build_maven.yml
index 7a01c136fce9..d43366fa86e7 100644
--- a/.github/workflows/build_maven.yml
+++ b/.github/workflows/build_maven.yml
@@ -30,3 +30,8 @@ jobs:
     name: Run
     uses: ./.github/workflows/maven_test.yml
     if: github.repository == 'apache/spark'
+    with:
+      envs: >-
+        {
+          "SKIP_SPARK_RELEASE_VERSIONS": "3.3.4,3.4.2,3.5.0"
+        }
diff --git a/sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveExternalCatalogVersionsSuite.scala b/sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveExternalCatalogVersionsSuite.scala
index ee2e64bc1905..50cf4017bd1e 100644
--- a/sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveExternalCatalogVersionsSuite.scala
+++ b/sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveExternalCatalogVersionsSuite.scala
@@ -249,6 +249,12 @@ class HiveExternalCatalogVersionsSuite extends SparkSubmitTestUtils {
 }
 
 object PROCESS_TABLES extends QueryTest with SQLTestUtils {
+  // TODO In SPARK-46302, the env SKIP_SPARK_RELEASE_VERSIONS has been added to
+  //  allow Maven tests to skip problematic release versions.
+  //  Related issues will be fixed in SPARK-46400, and testing will be resumed
+  //  after the fixed Spark 3.x version is released.
+  private val skipReleaseVersions =
+    sys.env.getOrElse("SKIP_SPARK_RELEASE_VERSIONS", "").split(",").toSet
   val isPythonVersionAvailable = TestUtils.isPythonVersionAvailable
   val releaseMirror = sys.env.getOrElse("SPARK_RELEASE_MIRROR",
    "https://dist.apache.org/repos/dist/release")
@@ -263,7 +269,8 @@ object PROCESS_TABLES extends QueryTest with SQLTestUtils {
         .filter(_.contains("""<a href="spark-"""))
         .filterNot(_.contains("preview"))
        .map("""<a href="spark-(\d.\d.\d)/">""".r.findFirstMatchIn(_).get.group(1))
-        .filter(_ < org.apache.spark.SPARK_VERSION).toImmutableArraySeq
+        .filter(_ < org.apache.spark.SPARK_VERSION)
+        .filterNot(skipReleaseVersions.contains).toImmutableArraySeq
     } catch {
       // Do not throw exception during object initialization.
       case NonFatal(_) => Nil
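
The skip logic the patch adds to `PROCESS_TABLES` can be sketched standalone as below. This is an illustrative sketch, not part of the patch: the env variable name `SKIP_SPARK_RELEASE_VERSIONS` matches the diff, while the object name, the `filterReleases` helper, and the sample version list are hypothetical.

```scala
// Standalone sketch of the version-skipping logic added by this patch.
// SKIP_SPARK_RELEASE_VERSIONS holds a comma-separated list of Spark release
// versions; any release found in that set is dropped before testing.
object SkipVersionsSketch {
  // `filterReleases` is a hypothetical helper; the patch inlines this logic
  // in PROCESS_TABLES, reading the variable from sys.env instead.
  def filterReleases(releases: Seq[String], env: Map[String, String]): Seq[String] = {
    val skipReleaseVersions =
      env.getOrElse("SKIP_SPARK_RELEASE_VERSIONS", "").split(",").toSet
    releases.filterNot(skipReleaseVersions.contains)
  }

  def main(args: Array[String]): Unit = {
    // Mirrors the JSON passed to maven_test.yml in build_maven.yml.
    val env = Map("SKIP_SPARK_RELEASE_VERSIONS" -> "3.3.4,3.4.2,3.5.0")
    // Only versions not listed in the env variable survive the filter.
    println(filterReleases(Seq("3.3.4", "3.4.2", "3.5.0", "3.5.1"), env))
    // prints List(3.5.1)
  }
}
```

When the variable is unset, `getOrElse` yields an empty string, whose split produces the set `Set("")`; no real version matches it, so all releases are tested as before.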


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
