zero323 commented on PR #37369:
URL: https://github.com/apache/spark/pull/37369#issuecomment-1202088146
> is there some method to enforce the input type checking? @zero323
There are some third-party attempts, but as far as I'm aware none is
particularly popular or well maintained (`pydantic` includes
`validate_arguments` on a provisional basis, but I haven't seen it used in
the wild).
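For illustration, here is a minimal sketch of the kind of argument checking such third-party tools provide, using only the standard library; the decorator name `check_argument_types` is hypothetical, not any particular library's API:

```python
import functools
import inspect
import typing


def check_argument_types(func):
    """Validate annotated arguments with isinstance at call time (sketch)."""
    hints = typing.get_type_hints(func)
    sig = inspect.signature(func)

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        bound = sig.bind(*args, **kwargs)
        for name, value in bound.arguments.items():
            expected = hints.get(name)
            # Only plain classes are checked here; generics like list[int]
            # need far more machinery, which is part of why such tools
            # are non-trivial to maintain.
            if isinstance(expected, type) and not isinstance(value, expected):
                raise TypeError(
                    f"{name} must be {expected.__name__}, "
                    f"got {type(value).__name__}"
                )
        return func(*args, **kwargs)

    return wrapper


@check_argument_types
def nsmallest(n: int) -> int:
    return n
```

With this sketch, `nsmallest(5)` passes while `nsmallest(5.5)` raises `TypeError` immediately, before the function body runs.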
Runtime checking is officially out of scope for core `typing`
(`@typing.runtime_checkable` is a bit of a different beast), so I wouldn't
expect any solution to emerge there.
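To show why `@typing.runtime_checkable` is a different beast: it only enables `isinstance()` checks against a `Protocol` (a structural check for the presence of the declared members), and does not validate argument types at call time. A small self-contained sketch:

```python
from typing import Protocol, runtime_checkable


@runtime_checkable
class SupportsClose(Protocol):
    def close(self) -> None: ...


class File:
    def close(self) -> None:
        pass


# Structural isinstance() checks work, but nothing here inspects the
# *signature* of close(), let alone arguments passed to ordinary functions.
print(isinstance(File(), SupportsClose))  # File has a close() method
print(isinstance(42, SupportsClose))      # int does not
```

So it addresses "does this object look like the protocol", not "were this function's arguments of the annotated types".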
Personally, I'd say that the current behavior is acceptable ‒ there is no
significant divergence from pandas:
```python
>>> df = pd.DataFrame({'A': [1, 2, 3, 4], 'B': [3, 4, 5, 6]}, columns=['A', 'B'])
>>> df.groupby(['A'])['B'].nsmallest(5.5)
Traceback (most recent call last):
...
TypeError: cannot do positional indexing on Int64Index with these indexers [True] of type bool
```
and the error message is meaningful enough.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]