itholic commented on code in PR #40624:
URL: https://github.com/apache/spark/pull/40624#discussion_r1156590284
##########
python/pyspark/errors/error_classes.py:
##########
@@ -59,6 +64,21 @@
"Argument `<arg_name>` should be a bool, dict, float, int or str, got <arg_type>."
Review Comment:
Yes. Since the number of type-related error classes keeps growing, it seems like it's time for a refactoring.
I have two approaches in mind, as below:
- We can create a main error class called such as `TYPE_MISMATCH` and
convert all type-related errors to sub-error classes of `TYPE_MISMATCH`.
- Alternatively, as you suggested, we can create a single `TYPE_MISMATCH` error class and add a parameter like `allowed_types` to manage all type-related errors at once under that one class.
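A rough sketch of how the two options might look in `error_classes.py` (the class and sub-class names here are illustrative placeholders, not the actual PySpark error classes):

```python
# Option 1 (hypothetical): a main TYPE_MISMATCH class whose sub-error
# classes each describe one expected type.
option_1 = {
    "TYPE_MISMATCH": {
        "message": ["Argument `<arg_name>` has the wrong type."],
        "sub_classes": {
            "NOT_BOOL": {"message": ["Expected bool, got <arg_type>."]},
            "NOT_STR": {"message": ["Expected str, got <arg_type>."]},
        },
    },
}

# Option 2 (hypothetical): one TYPE_MISMATCH class parameterized by an
# `allowed_types` placeholder, so all type errors share a single message.
option_2 = {
    "TYPE_MISMATCH": {
        "message": [
            "Argument `<arg_name>` should be one of <allowed_types>, "
            "got <arg_type>."
        ],
    },
}
```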
Which method do you think is more reasonable?
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]