Yikun opened a new pull request, #36142: URL: https://github.com/apache/spark/pull/36142
### What changes were proposed in this pull request?

Move the value type check before the `is_list_like` check. This is one of the fixes needed to make pandas-on-Spark work with pandas 1.4+.

### Why are the changes needed?

Since pandas v1.4.0, `is_list_like` uses [`not (hasattr(obj, "ndim") and obj.ndim == 0)`](https://github.com/pandas-dev/pandas/blob/d228a781b4d7be6c753cee940ce3ee692e97697d/pandas/_libs/lib.pyx#L1114) to exclude zero-dimensional duck-arrays, which are effectively scalars: https://github.com/pandas-dev/pandas/commit/d228a781b4d7be6c753cee940ce3ee692e97697d

This breaks for a `Column` instance: a `Column` cannot be coerced to a boolean with `not`, so the expression raises `ValueError: Cannot convert column into bool: please use '&' for 'and', '|' for 'or', '~' for 'not' when building DataFrame boolean expressions.` We therefore need to move the value type check before the `is_list_like` check.

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

The following unit tests pass with pandas 1.4.x:

```
OpsOnDiffFramesEnabledTest.test_frame_iloc_setitem
OpsOnDiffFramesEnabledTest.test_series_iloc_setitem
IndexingTest.test_series_iloc_setitem
IndexingTest.test_frame_iloc_setitem
```
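To illustrate the failure mode and the reordering described above, here is a minimal sketch. The `check_value` helper is illustrative only, not the actual pandas-on-Spark code, and it assumes a local `SparkSession`:

```python
from pandas.api.types import is_list_like
from pyspark.sql import Column, SparkSession

spark = SparkSession.builder.getOrCreate()
col = spark.range(1)["id"]  # a pyspark Column

# Under pandas >= 1.4, is_list_like evaluates
# `not (hasattr(obj, "ndim") and obj.ndim == 0)`. For a Column,
# `obj.ndim` resolves via Column.__getattr__ to a nested-field Column,
# so `obj.ndim == 0` is itself a Column, and applying `not` to it calls
# Column.__bool__, which raises the ValueError quoted above.
try:
    is_list_like(col)
except ValueError as e:
    print(e)

# The reordering sketched by this PR: test the value type first, so
# is_list_like is never reached for a Column. (check_value is a
# hypothetical helper, not the actual pandas-on-Spark function.)
def check_value(value):
    if isinstance(value, Column):  # value type check comes first
        return "column"
    if is_list_like(value):        # only plain Python/pandas values get here
        return "list-like"
    return "scalar"

print(check_value(col))        # 'column' -- no ValueError raised
print(check_value([1, 2, 3]))  # 'list-like'
```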
