gaogaotiantian commented on code in PR #54410:
URL: https://github.com/apache/spark/pull/54410#discussion_r2842819828
##########
python/pyspark/tests/upstream/pyarrow/test_pyarrow_type_coercion.py:
##########
@@ -507,27 +507,27 @@ def test_pandas_instances_coercion(self):
),
(pd.Series([int64_min, int64_max], dtype="int64"), pa.int64(),
[int64_min, int64_max]),
(pd.Series([int8_min, 0, int8_max], dtype="int8"), pa.int64(),
[int8_min, 0, int8_max]),
- # NaN to int → None (pandas-specific behavior)
+ # NaN to int -> None (pandas-specific behavior)
(pd.Series([nan, 1.0], dtype="float64"), pa.int64(), [None, 1]),
]
self._run_coercion_tests_with_values(numpy_cases)
- # Special float values (NaN/Inf) - type only
+ # Special float values (NaN/Inf) -> type only
for data, target in [
(pd.Series([nan, 1.0], dtype="float64"), pa.float64()),
(pd.Series([inf, neg_inf], dtype="float64"), pa.float64()),
(pd.Series([nan], dtype="float64"), pa.float32()),
]:
self.assertEqual(pa.array(data, type=target).type, target)
- # numpy int → decimal128 does NOT work
+ # numpy int -> decimal128 does NOT work
with self.assertRaises(pa.ArrowInvalid):
pa.array(pd.Series([1, 2, 3], dtype="int64"),
type=pa.decimal128(10, 0))
# ==== 3.2 Nullable Extension Types ====
# (data, target_type, expected_values)
nullable_cases = [
- # Int types → float
Review Comment:
So this is actually not enforced by ruff. The added ruff checker only checks
for "ambiguous unicode usage", like the quote I mentioned above. This fix was
made by hand. The checker was actually added pretty recently, and I believe
that's because LLMs like to generate characters like this.
I don't think having such characters in comments is terrible, and in
some cases they might actually be helpful. But unicode characters can cause
issues in some IDEs/machines/editors, and it isn't worth fighting over `→` vs
`->`. I don't even know how to type `→` myself :) .
That being said, this enforcement will not block unicode usage in the
future - people can still use it. This specific change is a side effect of
cleaning up unicode character usage in this PR.
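For reference, the ambiguous-unicode checks discussed here correspond to ruff's RUF001-RUF003 rules, which can be enabled in `pyproject.toml`. This is a minimal sketch of such a configuration, not the exact setup used in this PR:

```toml
# Sketch: enable ruff's ambiguous-unicode checks.
# RUF001-RUF003 flag characters confusable with ASCII (e.g. curly quotes),
# but not every non-ASCII character such as the arrow discussed above.
[tool.ruff.lint]
extend-select = [
    "RUF001",  # ambiguous unicode character in string literals
    "RUF002",  # ambiguous unicode character in docstrings
    "RUF003",  # ambiguous unicode character in comments
]
```

Note that these rules target *confusable* characters, which is consistent with the point above: they would catch a smart quote, but replacing `→` with `->` still has to be done by hand.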
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]