This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch branch-4.1
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-4.1 by this push:
     new bc427f724c45 [SPARK-54517][PYTHON][TESTS] Added utility decorators for Spark Connect parity tests
bc427f724c45 is described below

commit bc427f724c4513bd883ef4665a4f10b52ebabe44
Author: Takuya Ueshin <[email protected]>
AuthorDate: Tue Nov 25 22:19:38 2025 -0800

    [SPARK-54517][PYTHON][TESTS] Added utility decorators for Spark Connect parity tests
    
    ### What changes were proposed in this pull request?
    
    This PR aims to add utility decorators for Spark Connect cross-version tests:
    
    - `skip_if_server_version_is`
    - `skip_if_server_version_is_greater_than_or_equal_to`
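    A minimal, self-contained sketch of how these decorators behave, using a hypothetical test class and a simplified tuple-based version comparison in place of pyspark's `LooseVersion` (the `_FakeSpark`/`_FakeTest` names are illustrative, not part of the patch):

    ```python
    import functools
    import unittest
    from typing import Callable, Optional


    def _parse(version: str) -> tuple:
        # Simplified stand-in for pyspark.loose_version.LooseVersion.
        return tuple(int(p) for p in version.split(".") if p.isdigit())


    def skip_if_server_version_is(
        cond: Callable[[tuple], bool], reason: Optional[str] = None
    ) -> Callable:
        # Mirrors the decorator added by this patch: skip the wrapped test
        # when the connected server's version satisfies `cond`.
        def decorator(f: Callable) -> Callable:
            @functools.wraps(f)
            def wrapper(self, *args, **kwargs):
                version = self.spark.version
                if cond(_parse(version)):
                    raise unittest.SkipTest(
                        f"Skipping test {f.__name__} because server version is {version}"
                        + (f" ({reason})" if reason else "")
                    )
                return f(self, *args, **kwargs)

            return wrapper

        return decorator


    class _FakeSpark:
        # Hypothetical stub standing in for a Spark Connect session.
        version = "4.0.0"


    class _FakeTest:
        spark = _FakeSpark()

        @skip_if_server_version_is(lambda v: v >= _parse("4.1.0"), reason="needs older server")
        def test_old_behavior(self):
            # Runs only against servers older than 4.1.0.
            return "ran"
    ```

    With the stub server at 4.0.0 the condition is false and the test body executes; bumping `_FakeSpark.version` to "4.1.0" makes the decorator raise `unittest.SkipTest` instead.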
    
    ### Why are the changes needed?
    
    This is a forward-port from #53222.
    
    ### Does this PR introduce _any_ user-facing change?
    
    No.
    
    ### How was this patch tested?
    
    N/A
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #53223 from ueshin/issues/SPARK-54517/decorators.
    
    Authored-by: Takuya Ueshin <[email protected]>
    Signed-off-by: Dongjoon Hyun <[email protected]>
    (cherry picked from commit d9dcb5c4dbb5548c773d147fcb6e7bd54a33ef07)
    Signed-off-by: Dongjoon Hyun <[email protected]>
---
 python/pyspark/testing/connectutils.py | 30 +++++++++++++++++++++++++++---
 1 file changed, 27 insertions(+), 3 deletions(-)

diff --git a/python/pyspark/testing/connectutils.py b/python/pyspark/testing/connectutils.py
index 8539e16f03fc..f53ac77b24b6 100644
--- a/python/pyspark/testing/connectutils.py
+++ b/python/pyspark/testing/connectutils.py
@@ -21,11 +21,10 @@ import functools
 import unittest
 import uuid
 import contextlib
+from typing import Callable, Optional
 
 from pyspark import Row, SparkConf
-from pyspark.util import is_remote_only
-from pyspark.testing.utils import PySparkErrorTestUtils
-from pyspark import Row, SparkConf
+from pyspark.loose_version import LooseVersion
 from pyspark.util import is_remote_only
 from pyspark.testing.utils import (
     have_pandas,
@@ -306,3 +305,28 @@ class ReusedMixedTestCase(ReusedConnectTestCase, SQLTestUtils):
                 yield
 
         return _both_conf()
+
+
+def skip_if_server_version_is(
+    cond: Callable[[LooseVersion], bool], reason: Optional[str] = None
+) -> Callable[[...], ...]:
+    def decorator(f: Callable) -> Callable:
+        @functools.wraps(f)
+        def wrapper(self, *args, **kwargs):
+            version = self.spark.version
+            if cond(LooseVersion(version)):
+                raise unittest.SkipTest(
+                    f"Skipping test {f.__name__} because server version is {version}"
+                    + (f" ({reason})" if reason else "")
+                )
+            return f(self, *args, **kwargs)
+
+        return wrapper
+
+    return decorator
+
+
+def skip_if_server_version_is_greater_than_or_equal_to(
+    version: str, reason: Optional[str] = None
+) -> Callable[[...], ...]:
+    return skip_if_server_version_is(lambda v: v >= LooseVersion(version), reason)

