This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 5c075c3e577f [SPARK-50667][PYTHON][TESTS] Make `jinja2` optional in PySpark Tests
5c075c3e577f is described below

commit 5c075c3e577f05f4d2806a360524c121280e820f
Author: Ruifeng Zheng <[email protected]>
AuthorDate: Thu Dec 26 12:40:28 2024 +0900

    [SPARK-50667][PYTHON][TESTS] Make `jinja2` optional in PySpark Tests
    
    ### What changes were proposed in this pull request?
    Make `jinja2` optional in PySpark Tests
    
    ### Why are the changes needed?
    `jinja2` is an optional dependency of `pandas`
    
    https://pypi.org/pypi/pandas/2.2.0/json
    
    ```
    'jinja2>=3.1.2; extra == "output-formatting"'
    ```
    
    It is not a mandatory requirement of PySpark, so PySpark tests should succeed even if it is not installed.
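
    For reference, the change follows the optional-dependency guard pattern already used in `pyspark.testing.utils` for packages like `graphviz` and `flameprof`. A minimal sketch of that pattern (the `have_package` body and the test class name below are illustrative assumptions, not copied verbatim from Spark):

    ```
    import importlib
    import unittest


    def have_package(name: str) -> bool:
        # Probe for an optional package without failing the whole test run.
        try:
            importlib.import_module(name)
            return True
        except ImportError:
            return False


    have_jinja2 = have_package("jinja2")
    jinja2_requirement_message = None if have_jinja2 else "No module named 'jinja2'"


    class ToLatexTest(unittest.TestCase):
        # Skipped with a clear reason when jinja2 is absent, instead of erroring.
        @unittest.skipIf(not have_jinja2, jinja2_requirement_message)
        def test_to_latex(self):
            ...
    ```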
    
    ### Does this PR introduce _any_ user-facing change?
    no, test-only
    
    ### How was this patch tested?
    Manually tested after uninstalling it.
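
    For example (a plausible invocation; the exact commands were not recorded in the PR, so treat these as an assumption):

    ```
    pip uninstall jinja2
    python/run-tests --testnames 'pyspark.pandas.tests.io.test_dataframe_conversion'
    ```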
    
    ### Was this patch authored or co-authored using generative AI tooling?
    no
    
    Closes #49288 from zhengruifeng/optional_jinja2.
    
    Authored-by: Ruifeng Zheng <[email protected]>
    Signed-off-by: Hyukjin Kwon <[email protected]>
---
 python/pyspark/pandas/frame.py                              | 2 +-
 python/pyspark/pandas/tests/io/test_dataframe_conversion.py | 2 ++
 python/pyspark/testing/utils.py                             | 3 +++
 3 files changed, 6 insertions(+), 1 deletion(-)

diff --git a/python/pyspark/pandas/frame.py b/python/pyspark/pandas/frame.py
index f315d59a4fe9..35b96543b9eb 100644
--- a/python/pyspark/pandas/frame.py
+++ b/python/pyspark/pandas/frame.py
@@ -2632,7 +2632,7 @@ defaultdict(<class 'list'>, {'col..., 'col...})]
         ...                    'mask': ['red', 'purple'],
         ...                    'weapon': ['sai', 'bo staff']},
         ...                   columns=['name', 'mask', 'weapon'])
-        >>> print(df.to_latex(index=False)) # doctest: +NORMALIZE_WHITESPACE
+        >>> print(df.to_latex(index=False))  # doctest: +SKIP
         \begin{tabular}{lll}
         \toprule
               name &    mask &    weapon \\
diff --git a/python/pyspark/pandas/tests/io/test_dataframe_conversion.py b/python/pyspark/pandas/tests/io/test_dataframe_conversion.py
index d4b03a855d38..7cb997153729 100644
--- a/python/pyspark/pandas/tests/io/test_dataframe_conversion.py
+++ b/python/pyspark/pandas/tests/io/test_dataframe_conversion.py
@@ -26,6 +26,7 @@ import pandas as pd
 from pyspark import pandas as ps
 from pyspark.testing.pandasutils import PandasOnSparkTestCase, TestUtils
 from pyspark.testing.sqlutils import SQLTestUtils
+from pyspark.testing.utils import have_jinja2, jinja2_requirement_message
 
 
 class DataFrameConversionMixin:
@@ -199,6 +200,7 @@ class DataFrameConversionMixin:
             psdf.to_clipboard(sep=";", index=False), pdf.to_clipboard(sep=";", index=False)
         )
 
+    @unittest.skipIf(not have_jinja2, jinja2_requirement_message)
     def test_to_latex(self):
         pdf = self.pdf
         psdf = self.psdf
diff --git a/python/pyspark/testing/utils.py b/python/pyspark/testing/utils.py
index c38cd928d584..a89add74ca8f 100644
--- a/python/pyspark/testing/utils.py
+++ b/python/pyspark/testing/utils.py
@@ -94,6 +94,9 @@ graphviz_requirement_message = None if have_graphviz else "No module named 'grap
 have_flameprof = have_package("flameprof")
 flameprof_requirement_message = None if have_flameprof else "No module named 'flameprof'"
 
+have_jinja2 = have_package("jinja2")
+jinja2_requirement_message = None if have_jinja2 else "No module named 'jinja2'"
+
 pandas_requirement_message = None
 try:
     from pyspark.sql.pandas.utils import require_minimum_pandas_version


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
