This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/spark-connect-swift.git


The following commit(s) were added to refs/heads/main by this push:
     new 1e31947  [SPARK-55066] Use `Spark 3.5.8` for Spark 3 integration tests
1e31947 is described below

commit 1e319470eae51143a0e7f25d90eddfd24609444e
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Fri Jan 16 21:07:41 2026 +0900

    [SPARK-55066] Use `Spark 3.5.8` for Spark 3 integration tests
    
    ### What changes were proposed in this pull request?
    
    This PR aims to use `Spark 3.5.8` for Spark 3 integration tests.
    
    ### Why are the changes needed?
    
    Since Apache Spark 3.5.8 has been released, we should use this more stable version instead of 3.5.7.
    - https://github.com/apache/spark/releases/tag/v3.5.8
    
    ### Does this PR introduce _any_ user-facing change?
    
    No.
    
    ### How was this patch tested?
    
    Pass the CIs.
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #281 from dongjoon-hyun/SPARK-55066.
    
    Authored-by: Dongjoon Hyun <[email protected]>
    Signed-off-by: Dongjoon Hyun <[email protected]>
---
 .github/workflows/build_and_test.yml | 16 ++++++++--------
 1 file changed, 8 insertions(+), 8 deletions(-)

diff --git a/.github/workflows/build_and_test.yml b/.github/workflows/build_and_test.yml
index f6cabc4..b97610c 100644
--- a/.github/workflows/build_and_test.yml
+++ b/.github/workflows/build_and_test.yml
@@ -180,11 +180,11 @@ jobs:
       run: swift test --filter NOTHING -c release
     - name: Test
       run: |
-        curl -LO https://www.apache.org/dyn/closer.lua/spark/spark-3.5.7/spark-3.5.7-bin-hadoop3.tgz?action=download
-        tar xvfz spark-3.5.7-bin-hadoop3.tgz && rm spark-3.5.7-bin-hadoop3.tgz
-        mv spark-3.5.7-bin-hadoop3 /tmp/spark
+        curl -LO https://www.apache.org/dyn/closer.lua/spark/spark-3.5.8/spark-3.5.8-bin-hadoop3.tgz?action=download
+        tar xvfz spark-3.5.8-bin-hadoop3.tgz && rm spark-3.5.8-bin-hadoop3.tgz
+        mv spark-3.5.8-bin-hadoop3 /tmp/spark
         cd /tmp/spark/sbin
-        ./start-connect-server.sh --packages org.apache.spark:spark-connect_2.12:3.5.7
+        ./start-connect-server.sh --packages org.apache.spark:spark-connect_2.12:3.5.8
         cd -
         swift test --no-parallel -c release
 
@@ -205,11 +205,11 @@ jobs:
       run: swift test --filter NOTHING -c release
     - name: Test
       run: |
-        curl -LO https://www.apache.org/dyn/closer.lua/spark/spark-3.5.7/spark-3.5.7-bin-hadoop3.tgz?action=download
-        tar xvfz spark-3.5.7-bin-hadoop3.tgz && rm spark-3.5.7-bin-hadoop3.tgz
-        mv spark-3.5.7-bin-hadoop3 /tmp/spark
+        curl -LO https://www.apache.org/dyn/closer.lua/spark/spark-3.5.8/spark-3.5.8-bin-hadoop3.tgz?action=download
+        tar xvfz spark-3.5.8-bin-hadoop3.tgz && rm spark-3.5.8-bin-hadoop3.tgz
+        mv spark-3.5.8-bin-hadoop3 /tmp/spark
         cd /tmp/spark/sbin
-        ./start-connect-server.sh --packages org.apache.spark:spark-connect_2.12:3.5.7,org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.10.1 -c spark.sql.catalog.local=org.apache.iceberg.spark.SparkCatalog -c spark.sql.catalog.local.type=hadoop -c spark.sql.catalog.local.warehouse=/tmp/spark/warehouse -c spark.sql.defaultCatalog=local
+        ./start-connect-server.sh --packages org.apache.spark:spark-connect_2.12:3.5.8,org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.10.1 -c spark.sql.catalog.local=org.apache.iceberg.spark.SparkCatalog -c spark.sql.catalog.local.type=hadoop -c spark.sql.catalog.local.warehouse=/tmp/spark/warehouse -c spark.sql.defaultCatalog=local
         cd -
         swift test --filter DataFrameWriterV2Tests -c release
         swift test --filter IcebergTest -c release
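
A possible follow-up, not part of this commit: the Spark version string now appears in eight places across the two integration-test jobs, so each upgrade is an eight-line find-and-replace. A minimal sketch of one job's Test step, assuming a workflow-level `env` block were added to build_and_test.yml (the `SPARK_VERSION` name is hypothetical), would let the version be bumped in a single place:

    env:
      SPARK_VERSION: 3.5.8

        # ...
        - name: Test
          run: |
            curl -LO https://www.apache.org/dyn/closer.lua/spark/spark-${SPARK_VERSION}/spark-${SPARK_VERSION}-bin-hadoop3.tgz?action=download
            tar xvfz spark-${SPARK_VERSION}-bin-hadoop3.tgz && rm spark-${SPARK_VERSION}-bin-hadoop3.tgz
            mv spark-${SPARK_VERSION}-bin-hadoop3 /tmp/spark
            cd /tmp/spark/sbin
            ./start-connect-server.sh --packages org.apache.spark:spark-connect_2.12:${SPARK_VERSION}
            cd -
            swift test --no-parallel -c release

GitHub Actions exposes workflow-level `env` entries as environment variables inside `run` steps, so the shell expansion `${SPARK_VERSION}` works without further plumbing.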


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
