This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch branch-4.0
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-4.0 by this push:
     new 62b0e4b9d651 [SPARK-51098][DOCS] Link exceptAll and subtract in Python 
DataFrame docs
62b0e4b9d651 is described below

commit 62b0e4b9d651fbed76d671e25607e15a6501eacc
Author: Nicholas Chammas <[email protected]>
AuthorDate: Thu Feb 6 09:53:43 2025 +0900

    [SPARK-51098][DOCS] Link exceptAll and subtract in Python DataFrame docs
    
    ### What changes were proposed in this pull request?
    
    Add references from `DataFrame.exceptAll` to `DataFrame.subtract` and 
vice-versa.
    
    ### Why are the changes needed?
    
    It's a small convenience for users reading the docs, making it easier to
see closely related methods.
    
    This matches existing practice. For example, `DataFrame.union` and 
`DataFrame.unionAll` reference each other in this way already.
    
    ### Does this PR introduce _any_ user-facing change?
    
    No, this is a doc change only.
    
    ### How was this patch tested?
    
    I haven't been able to test this as I am having trouble building the docs 
locally.
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #49817 from nchammas/SPARK-51098-exceptAll-subtract.
    
    Authored-by: Nicholas Chammas <[email protected]>
    Signed-off-by: Hyukjin Kwon <[email protected]>
    (cherry picked from commit ad71780c5b4122b02a0a27890272235555ead048)
    Signed-off-by: Hyukjin Kwon <[email protected]>
---
 python/pyspark/sql/dataframe.py | 8 ++++++++
 1 file changed, 8 insertions(+)

diff --git a/python/pyspark/sql/dataframe.py b/python/pyspark/sql/dataframe.py
index 2d12704485ad..42c8386dd870 100644
--- a/python/pyspark/sql/dataframe.py
+++ b/python/pyspark/sql/dataframe.py
@@ -756,6 +756,10 @@ class DataFrame:
         -------
         :class:`DataFrame`
 
+        See Also
+        --------
+        DataFrame.subtract : Similar to `exceptAll`, but eliminates duplicates.
+
         Examples
         --------
         >>> df1 = spark.createDataFrame(
@@ -4762,6 +4766,10 @@ class DataFrame:
         -----
         This is equivalent to `EXCEPT DISTINCT` in SQL.
 
+        See Also
+        --------
+        DataFrame.exceptAll : Similar to `subtract`, but preserves duplicates.
+
         Examples
         --------
         Example 1: Subtracting two DataFrames with the same schema
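    The distinction the new `See Also` entries draw — `exceptAll` maps to SQL's
    `EXCEPT ALL` and preserves duplicates, while `subtract` maps to `EXCEPT
    DISTINCT` and eliminates them — can be sketched in plain Python, without a
    running Spark session, as multiset difference versus set difference (a
    rough semantic illustration only, not Spark code):

    ```python
    from collections import Counter

    rows1 = ["a", "a", "b", "c"]
    rows2 = ["a", "b"]

    # df1.exceptAll(df2) ~ EXCEPT ALL: multiset difference, duplicates preserved.
    # One "a" survives because rows1 has two and rows2 removes only one.
    except_all = list((Counter(rows1) - Counter(rows2)).elements())

    # df1.subtract(df2) ~ EXCEPT DISTINCT: set difference, duplicates eliminated.
    subtract = sorted(set(rows1) - set(rows2))

    print(except_all)  # ['a', 'c']
    print(subtract)    # ['c']
    ```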


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
