gaogaotiantian commented on code in PR #54146:
URL: https://github.com/apache/spark/pull/54146#discussion_r2766275059


##########
python/pyspark/pandas/tests/data_type_ops/testing_utils.py:
##########
@@ -219,3 +220,6 @@ def check_extension(self, left, right):
        pandas versions. Please refer to https://github.com/pandas-dev/pandas/issues/39410.
         """
         self.assert_eq(left, right)
+
+    def ignore_null(self, col):
+        return LooseVersion(pd.__version__) >= LooseVersion("3.0") and col == "decimal_nan"

Review Comment:
   Is `decimal_nan` the only case where this happens? I think this could matter any time a calculation produces a null-like value, not just for this one column.
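   The reviewer's suggestion could be sketched as a generalization of the patch's helper: instead of hard-coding `decimal_nan`, consult a set of columns whose computations are known to yield null-like results. This is a hypothetical sketch, not code from the PR; `is_null_tolerant` and `NULL_LIKE_COLS` are invented names, and the version check is simplified to comparing the major version rather than using `LooseVersion`.

   ```python
   def is_null_tolerant(pandas_version: str, col: str) -> bool:
       """Return True when a comparison should tolerate null-like results.

       Hypothetical generalization of the patch's ignore_null helper:
       rather than checking only "decimal_nan", consult a registry of
       columns whose computed results may contain null-like values on
       pandas >= 3.0.
       """
       # Assumption: "decimal_nan" is the only known case so far; the set
       # would grow as more null-producing computations are identified.
       NULL_LIKE_COLS = {"decimal_nan"}

       # Simplified version check: only the major component matters here.
       major = int(pandas_version.split(".")[0])
       return major >= 3 and col in NULL_LIKE_COLS
   ```

   A set-based registry keeps the version gate in one place, so adding a newly discovered null-producing column is a one-line change instead of another `or col == ...` clause.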



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
