This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 61814876b26c [SPARK-43354][PYTHON][TESTS] Re-enable `test_create_dataframe_from_pandas_with_day_time_interval` in PyPy3.9
61814876b26c is described below

commit 61814876b26c6fef2dc8238b1aeb0594d9a24472
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Thu Sep 12 20:49:16 2024 -0700

    [SPARK-43354][PYTHON][TESTS] Re-enable `test_create_dataframe_from_pandas_with_day_time_interval` in PyPy3.9
    
    ### What changes were proposed in this pull request?
    
    This PR aims to re-enable `test_create_dataframe_from_pandas_with_day_time_interval` in PyPy3.9.
    
    ### Why are the changes needed?
    
    This test was disabled for PyPy on Python 3.8, but Python 3.8 support has since been dropped and the test passes with PyPy3.9.
    - #46228
    
    **BEFORE: Skipped with `Fails in PyPy Python 3.8, should enable.` message**
    ```
    $ python/run-tests.py --python-executables pypy3 --testnames pyspark.sql.tests.test_creation
    Running PySpark tests. Output is in /Users/dongjoon/APACHE/spark-merge/python/unit-tests.log
    Will test against the following Python executables: ['pypy3']
    Will test the following Python tests: ['pyspark.sql.tests.test_creation']
    pypy3 python_implementation is PyPy
    pypy3 version is: Python 3.9.19 (a2113ea87262, Apr 21 2024, 05:41:07)
    [PyPy 7.3.16 with GCC Apple LLVM 15.0.0 (clang-1500.1.0.2.5)]
    Starting test(pypy3): pyspark.sql.tests.test_creation (temp output: /Users/dongjoon/APACHE/spark-merge/python/target/58e26724-5c3e-4451-80f8-cabdb36f0901/pypy3__pyspark.sql.tests.test_creation__n448ay57.log)
    Finished test(pypy3): pyspark.sql.tests.test_creation (6s) ... 3 tests were skipped
    Tests passed in 6 seconds
    
    Skipped tests in pyspark.sql.tests.test_creation with pypy3:
        test_create_dataframe_from_pandas_with_day_time_interval (pyspark.sql.tests.test_creation.DataFrameCreationTests) ... skipped 'Fails in PyPy Python 3.8, should enable.'
        test_create_dataframe_required_pandas_not_found (pyspark.sql.tests.test_creation.DataFrameCreationTests) ... skipped 'Required Pandas was found.'
        test_schema_inference_from_pandas_with_dict (pyspark.sql.tests.test_creation.DataFrameCreationTests) ... skipped '[PACKAGE_NOT_INSTALLED] PyArrow >= 10.0.0 must be installed; however, it was not found.'
    ```
    
    **AFTER**
    ```
    $ python/run-tests.py --python-executables pypy3 --testnames pyspark.sql.tests.test_creation
    Running PySpark tests. Output is in /Users/dongjoon/APACHE/spark-merge/python/unit-tests.log
    Will test against the following Python executables: ['pypy3']
    Will test the following Python tests: ['pyspark.sql.tests.test_creation']
    pypy3 python_implementation is PyPy
    pypy3 version is: Python 3.9.19 (a2113ea87262, Apr 21 2024, 05:41:07)
    [PyPy 7.3.16 with GCC Apple LLVM 15.0.0 (clang-1500.1.0.2.5)]
    Starting test(pypy3): pyspark.sql.tests.test_creation (temp output: /Users/dongjoon/APACHE/spark-merge/python/target/1f0db01f-0beb-4ee2-817f-363eb2f2804d/pypy3__pyspark.sql.tests.test_creation__2w4gy9u1.log)
    Finished test(pypy3): pyspark.sql.tests.test_creation (13s) ... 2 tests were skipped
    Tests passed in 13 seconds
    
    Skipped tests in pyspark.sql.tests.test_creation with pypy3:
        test_create_dataframe_required_pandas_not_found (pyspark.sql.tests.test_creation.DataFrameCreationTests) ... skipped 'Required Pandas was found.'
        test_schema_inference_from_pandas_with_dict (pyspark.sql.tests.test_creation.DataFrameCreationTests) ... skipped '[PACKAGE_NOT_INSTALLED] PyArrow >= 10.0.0 must be installed; however, it was not found.'
    ```
    
    ### Does this PR introduce _any_ user-facing change?
    
    No, this is a test-only change.
    
    ### How was this patch tested?
    
    Manual tests with PyPy3.9.
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #48097 from dongjoon-hyun/SPARK-43354.
    
    Authored-by: Dongjoon Hyun <[email protected]>
    Signed-off-by: Dongjoon Hyun <[email protected]>
---
 python/pyspark/sql/tests/test_creation.py | 7 +------
 1 file changed, 1 insertion(+), 6 deletions(-)
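
For context, the guard removed in the diff below keyed the skip off the interpreter implementation itself, so the test was skipped on every PyPy version regardless of whether it still failed. A minimal self-contained sketch of that detection pattern (the test class and method here are hypothetical placeholders, not the real PySpark test):

```python
import platform
import unittest

# platform.python_implementation() returns e.g. "CPython" or "PyPy".
IS_PYPY = "pypy" in platform.python_implementation().lower()


class GuardExample(unittest.TestCase):
    # Skip unconditionally on PyPy, mirroring the shape of the old guard.
    @unittest.skipIf(IS_PYPY, "Fails in PyPy Python 3.8, should enable.")
    def test_placeholder(self):
        self.assertTrue(True)
```

Because the condition never consults the Python version, such a guard stays in force even after the interpreter it was written for is gone, which is why it had to be removed by hand.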

diff --git a/python/pyspark/sql/tests/test_creation.py b/python/pyspark/sql/tests/test_creation.py
index dfe66cdd3edf..c6917aa234b4 100644
--- a/python/pyspark/sql/tests/test_creation.py
+++ b/python/pyspark/sql/tests/test_creation.py
@@ -15,7 +15,6 @@
 # limitations under the License.
 #
 
-import platform
 from decimal import Decimal
 import os
 import time
@@ -111,11 +110,7 @@ class DataFrameCreationTestsMixin:
                 os.environ["TZ"] = orig_env_tz
             time.tzset()
 
-    # TODO(SPARK-43354): Re-enable test_create_dataframe_from_pandas_with_day_time_interval
-    @unittest.skipIf(
-        "pypy" in platform.python_implementation().lower() or not have_pandas,
-        "Fails in PyPy Python 3.8, should enable.",
-    )
+    @unittest.skipIf(not have_pandas, pandas_requirement_message)  # type: ignore
     def test_create_dataframe_from_pandas_with_day_time_interval(self):
         # SPARK-37277: Test DayTimeIntervalType in createDataFrame without Arrow.
         import pandas as pd
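
The replacement guard skips only when pandas is genuinely unavailable, letting the test run on any interpreter that has it. A self-contained sketch of that pattern (here `have_pandas` and `pandas_requirement_message` are local stand-ins for the flags PySpark's test utilities provide, and the test body is illustrative only):

```python
import importlib.util
import unittest

# Stand-ins for PySpark's test-utility flags: detect pandas without importing it.
have_pandas = importlib.util.find_spec("pandas") is not None
pandas_requirement_message = "pandas must be installed to run this test."


class CreationExample(unittest.TestCase):
    @unittest.skipIf(not have_pandas, pandas_requirement_message)
    def test_needs_pandas(self):
        import pandas as pd

        # Just prove pandas is importable and usable.
        self.assertEqual(len(pd.DataFrame({"a": [1, 2, 3]})), 3)
```

Tying the skip to the actual dependency rather than to the interpreter means the condition stays accurate as supported Python versions change.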


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
