itholic commented on code in PR #41514:
URL: https://github.com/apache/spark/pull/41514#discussion_r1228833815


##########
python/pyspark/pandas/data_type_ops/num_ops.py:
##########
@@ -213,37 +214,31 @@ def abs(self, operand: IndexOpsLike) -> IndexOpsLike:
             F.abs(operand.spark.column), field=operand._internal.data_fields[0]
         )
 
+    def eq(self, left: IndexOpsLike, right: Any) -> SeriesOrIndex:
+        # We can directly use `super().eq` when given object is list, tuple, dict or set.
+        if not isinstance(right, IndexOpsMixin) and is_list_like(right):
+            return super().eq(left, right)
+        return pyspark_column_op("__eq__", left, right, fillna=False)
+
+    def ne(self, left: IndexOpsLike, right: Any) -> SeriesOrIndex:
+        _sanitize_list_like(right)
+        return pyspark_column_op("__ne__", left, right, fillna=True)

Review Comment:
   For `ne`:
   ```python
   >>> pser = pd.Series([1.0, 2.0, np.nan])
   >>> psser = ps.from_pandas(pser)
   >>> pser.ne(pser)
   0    False
   1    False
   2     True
   dtype: bool
   >>> psser.ne(psser)
   0    False
   1    False
   2     None
   dtype: bool
   ```
   
   We expect `True` for the non-equal case (`NaN` compared with `NaN`), but it returns `None` in Spark Connect, so we fill the `None` with `True` for `ne` (hence `fillna=True`).
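   
   Roughly speaking, I assume the `fillna` option here just replaces the `NULL` that Spark returns for such comparisons. A minimal sketch of that idea (the helper name and body below are hypothetical, not the actual `pyspark_column_op` internals):
   ```python
   from pyspark.sql import Column
   from pyspark.sql import functions as F
   
   def _fill_null_result(result: Column, fill_value: bool) -> Column:
       # A comparison involving a missing value comes back as NULL from Spark,
       # so replace NULL with the pandas-compatible boolean.
       return F.when(result.isNull(), F.lit(fill_value)).otherwise(result)
   ```
   For `ne` that means filling the `NULL` at index 2 with `True`, matching the pandas output above.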
   
   For `eq`:
   ```python
   >>> pser = pd.Series([1.0, 2.0, np.nan])
   >>> psser = ps.from_pandas(pser)
   >>> pser.eq(pser)
   0     True
   1     True
   2    False
   dtype: bool
   >>> psser.eq(psser)
   0     True
   1     True
   2     None
   dtype: bool
   ```
   
   We expect `False` for the non-equal case (`NaN` compared with `NaN`), but it returns `None` in Spark Connect, so we fill the `None` with `False` for `eq` (hence `fillna=False`).
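   
   FYI, the same kind of `NULL` result can be reproduced in plain PySpark, which also shows the fill trick. This is just an illustration (the DataFrame, column name, and the use of a `NULL` to stand in for the missing value are my own, not from this PR):
   ```python
   from pyspark.sql import SparkSession, functions as F
   
   spark = SparkSession.builder.getOrCreate()
   df = spark.createDataFrame([(1.0,), (2.0,), (None,)], "v double")
   
   raw_eq = df["v"] == df["v"]  # NULL == NULL evaluates to NULL in Spark SQL
   filled_eq = F.when(raw_eq.isNull(), F.lit(False)).otherwise(raw_eq)
   
   df.select(raw_eq.alias("raw"), filled_eq.alias("filled")).show()
   # raw:    true, true, NULL
   # filled: true, true, false  (matches the pandas `eq` output above)
   ```
   The `ne` case is symmetric, just with `True` as the fill value.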



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

