This is an automated email from the ASF dual-hosted git repository.
ruifengz pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new c878dfde3c72 [SPARK-54575][PYTHON][TESTS] Reenable test `SparkConnectCreationTests.test_with_none_and_nan`
c878dfde3c72 is described below
commit c878dfde3c721040231c8d6b637c9b852047a72d
Author: Ruifeng Zheng <[email protected]>
AuthorDate: Thu Dec 4 09:11:41 2025 +0800
[SPARK-54575][PYTHON][TESTS] Reenable test `SparkConnectCreationTests.test_with_none_and_nan`
### What changes were proposed in this pull request?
There was a bug in creating a DataFrame from an ndarray containing NaN values: NaN was incorrectly converted to null when Arrow optimization is on. That bug happened to be resolved in https://github.com/apache/spark/pull/53280, so this test can be re-enabled.
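The distinction the fixed bug was losing is that NaN is a valid IEEE-754 float value, while null/None marks a missing value; conflating them during conversion changes the data. A minimal sketch of that distinction using plain Python floats (not the Spark/Arrow conversion path itself):

```python
import math

# A row of floats as it might appear in an ndarray; NaN is a real
# IEEE-754 value, not a missing value.
row = [1.0, float("nan"), 3.0]

# NaN compares unequal to everything, including itself ...
assert row[1] != row[1]
# ... yet it is still a float, detectable with math.isnan.
assert math.isnan(row[1])
# Crucially, NaN is not None: converting it to null (the bug)
# silently turns a present value into a missing one.
assert row[1] is not None
```

The re-enabled test checks that `createDataFrame` preserves exactly this distinction when Arrow optimization is on.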
### Why are the changes needed?
for test coverage
### Does this PR introduce _any_ user-facing change?
no, test-only
### How was this patch tested?
ci
### Was this patch authored or co-authored using generative AI tooling?
no
Closes #53305 from zhengruifeng/reenable_test_with_none_and_nan.
Authored-by: Ruifeng Zheng <[email protected]>
Signed-off-by: Ruifeng Zheng <[email protected]>
---
python/pyspark/sql/tests/connect/test_connect_creation.py | 1 -
1 file changed, 1 deletion(-)
diff --git a/python/pyspark/sql/tests/connect/test_connect_creation.py b/python/pyspark/sql/tests/connect/test_connect_creation.py
index 0bd3e5b1e693..7be9959fdcb4 100644
--- a/python/pyspark/sql/tests/connect/test_connect_creation.py
+++ b/python/pyspark/sql/tests/connect/test_connect_creation.py
@@ -228,7 +228,6 @@ class SparkConnectCreationTests(ReusedMixedTestCase, PandasOnSparkTestUtils):
self.assertEqual(sdf.schema, cdf.schema)
self.assert_eq(sdf.toPandas(), cdf.toPandas())
- @unittest.skip("TODO(SPARK-54575): Re-enable this test")
def test_with_none_and_nan(self):
# SPARK-41855: make createDataFrame support None and NaN
# SPARK-41814: test with eqNullSafe
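The test comments reference `eqNullSafe`, Spark's null-safe equality operator (`<=>`), under which two nulls compare equal and, per Spark SQL semantics, NaN compares equal to NaN. A hedged pure-Python sketch of those semantics (the `eq_null_safe` helper below is hypothetical, for illustration only, not Spark's implementation):

```python
import math

def eq_null_safe(a, b):
    # Null-safe equality: None <=> None is True; None <=> anything
    # else is False (never null, unlike plain SQL equality).
    if a is None or b is None:
        return a is None and b is None
    # Spark SQL treats NaN as equal to NaN in comparisons.
    if isinstance(a, float) and isinstance(b, float):
        if math.isnan(a) and math.isnan(b):
            return True
    return a == b

assert eq_null_safe(None, None) is True
assert eq_null_safe(None, 1.0) is False
assert eq_null_safe(float("nan"), float("nan")) is True
assert eq_null_safe(1.0, 1.0) is True
```

These are the semantics the re-enabled test exercises against both the classic and Connect DataFrame implementations.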
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]