This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch branch-4.0
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-4.0 by this push:
     new 2f59ff61951f [SPARK-51176][PYTHON][CONNECT] Meet consistency for unexpected errors PySpark Connect <> Classic
2f59ff61951f is described below

commit 2f59ff61951fbf18b523ab8589cac6e135692674
Author: Haejoon Lee <[email protected]>
AuthorDate: Wed Feb 19 08:43:14 2025 +0900

    [SPARK-51176][PYTHON][CONNECT] Meet consistency for unexpected errors PySpark Connect <> Classic
    
    ### What changes were proposed in this pull request?
    
    This PR proposes to add `UnknownException` to the Spark Connect Python client so that unexpected errors are handled consistently between PySpark Connect and Classic.
    
    ### Why are the changes needed?
    
    To keep PySpark Connect consistent with Classic. Also, `UnknownException` is likely clearer to end users than `SparkConnectGrpcException` for unexpected errors.
    
    ### Does this PR introduce _any_ user-facing change?
    
    No API changes, but a user-facing error improvement.
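
    A quick illustration (plain Python, no Spark session needed; class bodies simplified) of why this is backward compatible: the new `UnknownException` subclasses `SparkConnectGrpcException`, so existing handlers that catch the old type still catch the new, more specific one.

```python
# Simplified stand-ins for the real classes in
# pyspark/errors/exceptions/connect.py -- bodies omitted for brevity.
class SparkConnectGrpcException(Exception):
    pass


class UnknownException(SparkConnectGrpcException):
    """Exception for unmapped errors in Spark Connect."""


# Code written against the old behavior keeps working,
# because the new class is a subclass of the old one:
try:
    raise UnknownException("unmapped server error")
except SparkConnectGrpcException as e:
    caught = type(e).__name__

print(caught)  # UnknownException
```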
    
    ### How was this patch tested?
    
    The existing CI should pass
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No
    
    Closes #49926 from itholic/unexpected_error.
    
    Lead-authored-by: Haejoon Lee <[email protected]>
    Co-authored-by: Hyukjin Kwon <[email protected]>
    Signed-off-by: Hyukjin Kwon <[email protected]>
    (cherry picked from commit dc3fb50d587d8e0a984bc87b9f0b97b8583c5dd7)
    Signed-off-by: Hyukjin Kwon <[email protected]>
---
 python/pyspark/errors/exceptions/connect.py | 35 +++++++++++++++++++++++++++--
 1 file changed, 33 insertions(+), 2 deletions(-)

diff --git a/python/pyspark/errors/exceptions/connect.py b/python/pyspark/errors/exceptions/connect.py
index 0da809473a01..f87494b72426 100644
--- a/python/pyspark/errors/exceptions/connect.py
+++ b/python/pyspark/errors/exceptions/connect.py
@@ -38,6 +38,7 @@ from pyspark.errors.exceptions.base import (
     QueryContextType,
     StreamingPythonRunnerInitializationException as BaseStreamingPythonRunnerInitException,
     PickleException as BasePickleException,
+    UnknownException as BaseUnknownException,
 )
 
 if TYPE_CHECKING:
@@ -114,8 +115,8 @@ def convert_exception(
                 contexts=contexts,
             )
 
-    # Return SparkConnectGrpcException if there is no matched exception class
-    return SparkConnectGrpcException(
+    # Return UnknownException if there is no matched exception class
+    return UnknownException(
         message,
         reason=info.reason,
         messageParameters=message_parameters,
@@ -217,6 +218,36 @@ class SparkConnectGrpcException(SparkConnectException):
         return self.getMessage()
 
 
+class UnknownException(SparkConnectGrpcException, BaseUnknownException):
+    """
+    Exception for unmapped errors in Spark Connect.
+    This class is functionally identical to SparkConnectGrpcException but has a different name
+    for consistency.
+    """
+
+    def __init__(
+        self,
+        message: Optional[str] = None,
+        errorClass: Optional[str] = None,
+        messageParameters: Optional[Dict[str, str]] = None,
+        reason: Optional[str] = None,
+        sql_state: Optional[str] = None,
+        server_stacktrace: Optional[str] = None,
+        display_server_stacktrace: bool = False,
+        contexts: Optional[List[BaseQueryContext]] = None,
+    ) -> None:
+        super().__init__(
+            message=message,
+            errorClass=errorClass,
+            messageParameters=messageParameters,
+            reason=reason,
+            sql_state=sql_state,
+            server_stacktrace=server_stacktrace,
+            display_server_stacktrace=display_server_stacktrace,
+            contexts=contexts,
+        )
+
+
 class AnalysisException(SparkConnectGrpcException, BaseAnalysisException):
     """
     Failed to analyze a SQL query plan, thrown from Spark Connect.


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
