This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/spark-connect-swift.git


The following commit(s) were added to refs/heads/main by this push:
     new f81aba2  [SPARK-51718] Update `README.md` with Spark 4.0.0 RC3
f81aba2 is described below

commit f81aba2fe5380951e2b54e2d34e789ce7865a5b6
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Fri Apr 4 14:56:57 2025 +0900

    [SPARK-51718] Update `README.md` with Spark 4.0.0 RC3
    
    ### What changes were proposed in this pull request?
    
    This PR aims to bring `README.md` up to date with the following:
    - Apache Spark 4.0.0 RC3
    - New APIs such as `filter`, `cache`, `read`, `write`, `mode`, and `orc`
    
    ### Why are the changes needed?
    
    To provide more examples.
    
    ### Does this PR introduce _any_ user-facing change?
    
    No, this is a documentation-only change.
    
    ### How was this patch tested?
    
    Manual review. Also, the newly updated example can be tested in the following repository.
    - https://github.com/dongjoon-hyun/spark-connect-swift-app
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #41 from dongjoon-hyun/SPARK-51718.
    
    Authored-by: Dongjoon Hyun <[email protected]>
    Signed-off-by: Dongjoon Hyun <[email protected]>
---
 README.md | 20 ++++++++++++++++----
 1 file changed, 16 insertions(+), 4 deletions(-)

diff --git a/README.md b/README.md
index feb1f0e..f3d734d 100644
--- a/README.md
+++ b/README.md
@@ -6,10 +6,10 @@
 
 This is an experimental Swift library to show how to connect to a remote Apache Spark Connect Server and run SQL statements to manipulate remote data.
 
-So far, this library project is tracking the upstream changes like the [Apache Spark](https://spark.apache.org) 4.0.0 RC2 release and [Apache Arrow](https://arrow.apache.org) project's Swift-support.
+So far, this library project is tracking the upstream changes like the [Apache Spark](https://spark.apache.org) 4.0.0 RC3 release and [Apache Arrow](https://arrow.apache.org) project's Swift-support.
 
 ## Requirement
-- [Apache Spark 4.0.0 RC2 (March 2025)](https://dist.apache.org/repos/dist/dev/spark/v4.0.0-rc2-bin/)
+- [Apache Spark 4.0.0 RC3 (March 2025)](https://dist.apache.org/repos/dist/dev/spark/v4.0.0-rc3-bin/)
 - [Swift 6.0 (2024)](https://swift.org)
 - [gRPC Swift 2.1 (March 2025)](https://github.com/grpc/grpc-swift/releases/tag/2.1.2)
 - [gRPC Swift Protobuf 1.1 (March 2025)](https://github.com/grpc/grpc-swift-protobuf/releases/tag/1.1.0)
@@ -59,7 +59,7 @@ print("Connected to Apache Spark \(await spark.version) Server")
 
 let statements = [
   "DROP TABLE IF EXISTS t",
-  "CREATE TABLE IF NOT EXISTS t(a INT)",
+  "CREATE TABLE IF NOT EXISTS t(a INT) USING ORC",
   "INSERT INTO t VALUES (1), (2), (3)",
 ]
 
@@ -68,7 +68,10 @@ for s in statements {
   _ = try await spark.sql(s).count()
 }
 print("SELECT * FROM t")
-try await spark.sql("SELECT * FROM t").show()
+try await spark.sql("SELECT * FROM t").cache().show()
+
+try await spark.range(10).filter("id % 2 == 0").write.mode("overwrite").orc("/tmp/orc")
+try await spark.read.orc("/tmp/orc").show()
 
 await spark.stop()
 ```
@@ -90,6 +93,15 @@ SELECT * FROM t
 | 1 |
 | 3 |
 +---+
++----+
+| id |
++----+
+| 2  |
+| 6  |
+| 0  |
+| 8  |
+| 4  |
++----+
 ```
 
 You can find this example in the following repository.

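The new APIs exercised by this patch (`range`, `filter`, `write.mode(...).orc`, `read.orc`, `cache`, `show`) can also be combined into a minimal standalone program. The sketch below is illustrative only: it assumes the `SparkConnect` module name, a `SparkSession.builder.getOrCreate()` entry point, and a Spark Connect server reachable at the default endpoint — none of which are confirmed by this commit itself.

```swift
// Hypothetical sketch of the APIs added in SPARK-51718.
// Assumptions: module name `SparkConnect`, builder-style session creation,
// and a running Spark Connect server on the default port.
import SparkConnect

@main
struct OrcExample {
    static func main() async throws {
        let spark = try await SparkSession.builder.getOrCreate()

        // Keep only even ids from 0..<10 and write them as ORC,
        // replacing any output from a previous run.
        try await spark.range(10)
            .filter("id % 2 == 0")
            .write.mode("overwrite")
            .orc("/tmp/orc")

        // Read the ORC files back, cache the result, and print it.
        try await spark.read.orc("/tmp/orc").cache().show()

        await spark.stop()
    }
}
```

Running this against a Spark 4.0.0 RC3 server should print the five even ids, matching the table in the updated `README.md`.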

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
