This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 11e2e4245df [SPARK-44481][CONNECT][PYTHON] Make pyspark.sql.is_remote an API
11e2e4245df is described below

commit 11e2e4245df128a159adb0d2a080c005e4860274
Author: Hyukjin Kwon <[email protected]>
AuthorDate: Thu Jul 20 09:02:38 2023 +0900

    [SPARK-44481][CONNECT][PYTHON] Make pyspark.sql.is_remote an API
    
    ### What changes were proposed in this pull request?
    
    This PR proposes to expose and document `pyspark.sql.is_remote` as an API.
    
    ### Why are the changes needed?
    
    So that end users can branch with if-else, e.g., to dispatch the code path to legacy mode or Spark Connect mode.
    
    ### Does this PR introduce _any_ user-facing change?
    
    Yes, it exposes `pyspark.sql.is_remote` as a public API.
    
    ### How was this patch tested?
    
    Manually built and checked the documentation.
    
    Closes #42072 from HyukjinKwon/SPARK-44481.
    
    Authored-by: Hyukjin Kwon <[email protected]>
    Signed-off-by: Hyukjin Kwon <[email protected]>
---
 .../source/reference/pyspark.sql/spark_session.rst    |  2 +-
 python/pyspark/sql/__init__.py                        |  2 ++
 python/pyspark/sql/utils.py                           | 19 +++++++++++++++++++
 3 files changed, 22 insertions(+), 1 deletion(-)

diff --git a/python/docs/source/reference/pyspark.sql/spark_session.rst b/python/docs/source/reference/pyspark.sql/spark_session.rst
index 9867a9cd121..6a46db7576b 100644
--- a/python/docs/source/reference/pyspark.sql/spark_session.rst
+++ b/python/docs/source/reference/pyspark.sql/spark_session.rst
@@ -50,7 +50,7 @@ See also :class:`SparkSession`.
     SparkSession.udf
     SparkSession.udtf
     SparkSession.version
-
+    is_remote
 
 Spark Connect Only
 ------------------
diff --git a/python/pyspark/sql/__init__.py b/python/pyspark/sql/__init__.py
index d0d69488fa4..dd82b037a6b 100644
--- a/python/pyspark/sql/__init__.py
+++ b/python/pyspark/sql/__init__.py
@@ -50,6 +50,7 @@ from pyspark.sql.observation import Observation
 from pyspark.sql.readwriter import DataFrameReader, DataFrameWriter, DataFrameWriterV2
 from pyspark.sql.window import Window, WindowSpec
 from pyspark.sql.pandas.group_ops import PandasCogroupedOps
+from pyspark.sql.utils import is_remote
 
 
 __all__ = [
@@ -72,4 +73,5 @@ __all__ = [
     "DataFrameWriter",
     "DataFrameWriterV2",
     "PandasCogroupedOps",
+    "is_remote",
 ]
diff --git a/python/pyspark/sql/utils.py b/python/pyspark/sql/utils.py
index 608ed7e9ac9..8b520ed653f 100644
--- a/python/pyspark/sql/utils.py
+++ b/python/pyspark/sql/utils.py
@@ -146,6 +146,25 @@ def is_timestamp_ntz_preferred() -> bool:
 def is_remote() -> bool:
     """
     Returns if the current running environment is for Spark Connect.
+
+    .. versionadded:: 4.0.0
+
+    Notes
+    -----
+    This will only return ``True`` if there is a remote session running.
+    Otherwise, it returns ``False``.
+
+    This API is unstable and intended for developers.
+
+    Returns
+    -------
+    bool
+
+    Examples
+    --------
+    >>> from pyspark.sql import is_remote
+    >>> is_remote()
+    False
     """
     return "SPARK_CONNECT_MODE_ENABLED" in os.environ
 

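The patch keys `is_remote()` off a single environment variable, which is what enables the if-else dispatch the commit message describes. A minimal sketch of that pattern, mirroring the check added in python/pyspark/sql/utils.py (the `pick_code_path` helper is illustrative, not part of the patch):

```python
import os

def is_remote() -> bool:
    # Mirrors the implementation added by this commit: Spark Connect mode
    # is signalled purely by the presence of an environment variable.
    return "SPARK_CONNECT_MODE_ENABLED" in os.environ

def pick_code_path() -> str:
    # Hypothetical dispatcher: choose between the legacy and Spark Connect
    # code paths based on the running environment.
    return "connect" if is_remote() else "legacy"

if __name__ == "__main__":
    print(pick_code_path())
```

In a real application the same branch would typically select between `pyspark.sql.SparkSession` and the Spark Connect session implementation rather than returning a string.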
