ueshin opened a new pull request, #48664:
URL: https://github.com/apache/spark/pull/48664

   ### What changes were proposed in this pull request?
   
   Adds the following DataFrame APIs for subqueries to Spark Classic.
   
   - `scalar()`
   - `exists()`
   
   Also, adds `outer()` to `Column` to specify outer references.
   
   #### Examples:
   
   For the following tables `l` and `r`:
   
   ```py
   >>> spark.table("l").printSchema()
   root
    |-- a: long (nullable = true)
    |-- b: double (nullable = true)
   
   >>> spark.table("r").printSchema()
   root
    |-- c: long (nullable = true)
    |-- d: double (nullable = true)
   ```
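   
   The data values below are illustrative only (not taken from this PR); a minimal setup sketch so the examples can be run locally:
   
   ```py
   # Hypothetical sample rows; any data matching the schemas above will do.
   spark.createDataFrame(
       [(1, 2.0), (2, 3.0), (3, 4.0)], ["a", "b"]
   ).createOrReplaceTempView("l")
   spark.createDataFrame(
       [(2, 6.0), (3, 5.0)], ["c", "d"]
   ).createOrReplaceTempView("r")
   ```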
   
   ```py
   from pyspark.sql import functions as sf
   
   # Correlated scalar subquery in a filter: keep rows of `l` where `b` is
   # less than the maximum `d` in `r` for the matching outer value of `a`.
   spark.table("l").where(
       sf.col("b")
       < (
           spark.table("r")
           .where(sf.col("a").outer() == sf.col("c"))
           .select(sf.max("d"))
           .scalar()
       )
   )
   
   # Correlated scalar subquery in a projection: for each row of `l`,
   # compute the sum of `b` over the rows of `l` with the same `a`.
   spark.table("l").select(
       "a",
       (
           spark.table("l")
           .where(sf.col("a") == sf.col("a").outer())
           .select(sf.sum("b"))
           .scalar()
           .alias("sum_b")
       ),
   )
   
   # EXISTS predicate: keep rows of `l` that have a matching row in `r`.
   spark.table("l").where(
       spark.table("r").where(sf.col("a").outer() == sf.col("c")).exists()
   )
   ```
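   
   Since `scalar()` and `exists()` return a regular `Column`, they should compose with the usual `Column` operators; for example, a NOT EXISTS filter can presumably be written by negating the result (a sketch based on that assumption, not an example from this PR):
   
   ```py
   # Rows of `l` with no matching row in `r` (NOT EXISTS), assuming the
   # exists() Column composes with `~` like any other boolean Column.
   spark.table("l").where(
       ~spark.table("r").where(sf.col("a").outer() == sf.col("c")).exists()
   )
   ```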
   
   ### Why are the changes needed?
   
   Subquery APIs are currently missing from the DataFrame API.
   
   ### Does this PR introduce _any_ user-facing change?
   
   Yes, new DataFrame APIs for subqueries will be available.
   
   ### How was this patch tested?
   
   Added related tests.
   
   ### Was this patch authored or co-authored using generative AI tooling?
   
   No.

