This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/spark-connect-swift.git


The following commit(s) were added to refs/heads/main by this push:
     new 9c63317  [SPARK-52183] Update `SparkSQLRepl` example to show up to 10k rows
9c63317 is described below

commit 9c63317438ce36bfa03d36f2743b5a7667689d48
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Fri May 16 05:51:57 2025 -0700

    [SPARK-52183] Update `SparkSQLRepl` example to show up to 10k rows
    
    ### What changes were proposed in this pull request?
    
    This PR aims to update `SparkSQLRepl` example to show up to 10k rows.
    
    ### Why are the changes needed?
    
    Currently, `SparkSQLRepl` uses `show()` with its default parameters. Although very large numbers of rows still cannot be handled due to `grpc_max_message_size`, a more generous default value is preferable.
    
    ```SQL
    spark-sql (default)> SELECT * FROM RANGE(21);
    +---+
    | id|
    +---+
    |  0|
    |...|
    | 19|
    +---+
    only showing top 20 rows
    Time taken: 118 ms
    ```
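    The change below also switches from passing the bare method reference `show` to wrapping the call in a closure, since a call with arguments such as `show(10000, false)` can no longer be passed unapplied. A minimal, self-contained sketch of that pattern (using a hypothetical `time` helper, not the actual spark-connect-swift API):

    ```swift
    // Hypothetical timing helper: measures an async throwing closure,
    // mirroring how the REPL wraps the parameterized `show` call.
    func time(_ body: () async throws -> Void) async rethrows {
      let start = ContinuousClock.now
      try await body()
      print("Time taken: \(ContinuousClock.now - start)")
    }

    // Usage sketch: a method reference like `df.show` fits only the
    // zero-argument form; supplying arguments requires a closure:
    //   try await time { try await df.show(10000, false) }
    ```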
    
    ### Does this PR introduce _any_ user-facing change?
    
    No. This only updates an example.
    
    ### How was this patch tested?
    
    Manual test.
    
    ```SQL
    $ swift run
    spark-sql (default)> SELECT * FROM RANGE(10001);
    ...
    only showing top 10000 rows
    Time taken: 142 ms
    ```
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #159 from dongjoon-hyun/SPARK-52183.
    
    Authored-by: Dongjoon Hyun <[email protected]>
    Signed-off-by: Dongjoon Hyun <[email protected]>
---
 Sources/SparkSQLRepl/main.swift | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/Sources/SparkSQLRepl/main.swift b/Sources/SparkSQLRepl/main.swift
index 6c45ae0..a5c6c5d 100644
--- a/Sources/SparkSQLRepl/main.swift
+++ b/Sources/SparkSQLRepl/main.swift
@@ -46,7 +46,7 @@ while isRunning {
       break
     default:
       do {
-        try await spark.time(spark.sql(String(match.1)).show)
+        try await spark.time({ try await spark.sql(String(match.1)).show(10000, false) })
       } catch {
         print("Error: \(error)")
       }


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
