This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 751a81b23bd3 [SPARK-46751][PYTHON][TESTS] Skip test_datasource if PyArrow is not installed
751a81b23bd3 is described below

commit 751a81b23bd35ebe6a3fc1f328405c5d57291a0c
Author: Hyukjin Kwon <gurwls...@apache.org>
AuthorDate: Wed Jan 17 18:29:08 2024 -0800

    [SPARK-46751][PYTHON][TESTS] Skip test_datasource if PyArrow is not installed
    
    ### What changes were proposed in this pull request?
    
    This PR proposes to skip `test_datasource` if PyArrow is not installed, because the test uses `mapInArrow`, which requires PyArrow.
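    
    For reference, here is a minimal, self-contained sketch of the skip pattern. The try/except fallback only mimics the shape of `have_pyarrow` and `pyarrow_requirement_message` from `pyspark.testing.sqlutils`, and the class and test names are illustrative:
    
    ```python
    import unittest
    
    # Detect PyArrow at import time; the message is None when PyArrow is
    # available and a human-readable skip reason otherwise.
    try:
        import pyarrow  # noqa: F401
    
        have_pyarrow = True
        pyarrow_requirement_message = None
    except ImportError:
        have_pyarrow = False
        pyarrow_requirement_message = "PyArrow is not installed"
    
    
    @unittest.skipIf(not have_pyarrow, pyarrow_requirement_message)
    class ArrowDependentTests(unittest.TestCase):
        def test_uses_arrow(self):
            # Anything that ultimately calls mapInArrow is skipped
            # when PyArrow is missing.
            self.assertTrue(have_pyarrow)
    
    
    if __name__ == "__main__":
        unittest.main()
    ```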
    
    ### Why are the changes needed?
    
    To make the build pass in environments that do not have PyArrow installed.
    
    The scheduled job currently fails (with PyPy3): https://github.com/apache/spark/actions/runs/7557652490/job/20577472214
    
    ### Does this PR introduce _any_ user-facing change?
    
    No, test-only.
    
    ### How was this patch tested?
    
    Scheduled jobs should verify this. I also tested it manually.
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #44776 from HyukjinKwon/SPARK-46751.
    
    Authored-by: Hyukjin Kwon <gurwls...@apache.org>
    Signed-off-by: Dongjoon Hyun <dh...@apple.com>
---
 python/pyspark/sql/tests/test_python_datasource.py | 5 +++++
 1 file changed, 5 insertions(+)

diff --git a/python/pyspark/sql/tests/test_python_datasource.py b/python/pyspark/sql/tests/test_python_datasource.py
index 79414cb7ed69..6ba4b68b02ba 100644
--- a/python/pyspark/sql/tests/test_python_datasource.py
+++ b/python/pyspark/sql/tests/test_python_datasource.py
@@ -29,11 +29,16 @@ from pyspark.sql.datasource import (
     CaseInsensitiveDict,
 )
 from pyspark.sql.types import Row, StructType
+from pyspark.testing.sqlutils import (
+    have_pyarrow,
+    pyarrow_requirement_message,
+)
 from pyspark.testing import assertDataFrameEqual
 from pyspark.testing.sqlutils import ReusedSQLTestCase
 from pyspark.testing.utils import SPARK_HOME
 
 
+@unittest.skipIf(not have_pyarrow, pyarrow_requirement_message)
 class BasePythonDataSourceTestsMixin:
     def test_basic_data_source_class(self):
         class MyDataSource(DataSource):
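
A note on the change above: `unittest.skipIf` applied at class level sets `__unittest_skip__` and the skip reason as class attributes, and concrete test classes that inherit from `BasePythonDataSourceTestsMixin` (presumably combined with `ReusedSQLTestCase` elsewhere in this file, as the imports suggest) pick those attributes up through normal attribute lookup, so their tests are reported as skipped rather than failed when PyArrow is missing. A minimal sketch with illustrative names:

```python
import unittest

have_dep = False  # stand-in for have_pyarrow


# Class-level skipIf marks the mixin with __unittest_skip__ = True.
@unittest.skipIf(not have_dep, "optional dependency is not installed")
class BaseFeatureTestsMixin:
    def test_feature(self):
        raise AssertionError("never runs when the dependency is missing")


# The concrete test class inherits the skip marker from the mixin,
# so unittest reports its tests as skipped, not failed.
class FeatureTests(BaseFeatureTestsMixin, unittest.TestCase):
    pass


if __name__ == "__main__":
    unittest.main(verbosity=2)
```

Running the file directly prints the inherited test as skipped with the given reason.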


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
