This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/spark-connect-swift.git


The following commit(s) were added to refs/heads/main by this push:
     new b9c4756  [SPARK-54897] Upgrade `Iceberg` to 1.10.1 in integration tests
b9c4756 is described below

commit b9c47565aa1c9996ababcec1715139079a07875e
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Mon Jan 5 07:11:30 2026 +0900

    [SPARK-54897] Upgrade `Iceberg` to 1.10.1 in integration tests
    
    ### What changes were proposed in this pull request?
    
    This PR aims to upgrade `Iceberg` to 1.10.1 in integration tests.
    
    ### Why are the changes needed?
    
    To use the latest bug-fix release:
    - https://github.com/apache/iceberg/releases/tag/apache-iceberg-1.10.1
    
    ### Does this PR introduce _any_ user-facing change?
    
    No.
    
    ### How was this patch tested?
    
    Pass the CIs.
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #274 from dongjoon-hyun/SPARK-54897.
    
    Authored-by: Dongjoon Hyun <[email protected]>
    Signed-off-by: Dongjoon Hyun <[email protected]>
---
 .github/workflows/build_and_test.yml | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/.github/workflows/build_and_test.yml b/.github/workflows/build_and_test.yml
index 9f4c241..8575286 100644
--- a/.github/workflows/build_and_test.yml
+++ b/.github/workflows/build_and_test.yml
@@ -209,7 +209,7 @@ jobs:
         tar xvfz spark-3.5.7-bin-hadoop3.tgz && rm spark-3.5.7-bin-hadoop3.tgz
         mv spark-3.5.7-bin-hadoop3 /tmp/spark
         cd /tmp/spark/sbin
-        ./start-connect-server.sh --packages org.apache.spark:spark-connect_2.12:3.5.7,org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.10.0 -c spark.sql.catalog.local=org.apache.iceberg.spark.SparkCatalog -c spark.sql.catalog.local.type=hadoop -c spark.sql.catalog.local.warehouse=/tmp/spark/warehouse -c spark.sql.defaultCatalog=local
+        ./start-connect-server.sh --packages org.apache.spark:spark-connect_2.12:3.5.7,org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.10.1 -c spark.sql.catalog.local=org.apache.iceberg.spark.SparkCatalog -c spark.sql.catalog.local.type=hadoop -c spark.sql.catalog.local.warehouse=/tmp/spark/warehouse -c spark.sql.defaultCatalog=local
         cd -
         swift test --filter DataFrameWriterV2Tests -c release
         swift test --filter IcebergTest -c release
@@ -235,7 +235,7 @@ jobs:
         tar xvfz spark-4.0.1-bin-hadoop3.tgz && rm spark-4.0.1-bin-hadoop3.tgz
         mv spark-4.0.1-bin-hadoop3 /tmp/spark
         cd /tmp/spark/sbin
-        ./start-connect-server.sh --packages org.apache.spark:spark-connect_2.13:4.0.1,org.apache.iceberg:iceberg-spark-runtime-4.0_2.13:1.10.0 -c spark.sql.catalog.local=org.apache.iceberg.spark.SparkCatalog -c spark.sql.catalog.local.type=hadoop -c spark.sql.catalog.local.warehouse=/tmp/spark/warehouse -c spark.sql.defaultCatalog=local
+        ./start-connect-server.sh --packages org.apache.spark:spark-connect_2.13:4.0.1,org.apache.iceberg:iceberg-spark-runtime-4.0_2.13:1.10.1 -c spark.sql.catalog.local=org.apache.iceberg.spark.SparkCatalog -c spark.sql.catalog.local.type=hadoop -c spark.sql.catalog.local.warehouse=/tmp/spark/warehouse -c spark.sql.defaultCatalog=local
         cd -
         swift test --filter DataFrameWriterV2Tests -c release
         swift test --filter IcebergTest -c release
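
For context, the `IcebergTest` and `DataFrameWriterV2Tests` suites run against the Spark Connect
server started above, which is launched with the Iceberg Spark runtime package and a `local`
Hadoop catalog set as the default catalog. A minimal sketch of such a round-trip in Swift is
shown below; it assumes the `SparkConnect` client API illustrated in the project README
(`SparkSession.builder.getOrCreate()`, `sql(_:)`, `show()`, `stop()`) and uses a purely
illustrative table name `local.db.t`:

    import SparkConnect

    // Connect to the locally started Spark Connect server (sc://localhost:15002 by default).
    let spark = try await SparkSession.builder.getOrCreate()

    // `local` is the Iceberg catalog configured via spark.sql.defaultCatalog=local in the
    // workflow above; the table name below is illustrative only.
    _ = try await spark.sql("CREATE TABLE IF NOT EXISTS local.db.t (id BIGINT) USING iceberg").count()
    _ = try await spark.sql("INSERT INTO local.db.t VALUES (1), (2), (3)").count()
    try await spark.sql("SELECT * FROM local.db.t ORDER BY id").show()

    await spark.stop()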


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
