ueshin opened a new pull request, #50816:
URL: https://github.com/apache/spark/pull/50816
### What changes were proposed in this pull request?
Blocks pandas API on Spark on ANSI mode by default.
To force it to work on ANSI mode, set the pandas-on-Spark option
`compute.fail_on_ansi_mode` to `False`, as shown in the example below.
```py
>>> spark.conf.get('spark.sql.ansi.enabled')
'true'
>>> import pyspark.pandas as ps
>>> ps.range(1)
Traceback (most recent call last):
...
pyspark.errors.exceptions.base.UnsupportedOperationException:
[PANDAS_API_ON_SPARK_FAIL_ON_ANSI_MODE] Pandas API on Spark does not properly
work on ANSI mode.
Please set a Spark config 'spark.sql.ansi.enabled' to `false`.
Alternatively set a pandas-on-spark option 'compute.fail_on_ansi_mode' to
`False` to force it to work, although it can cause unexpected behavior.
>>> ps.set_option('compute.fail_on_ansi_mode', False)
>>> ps.range(1)
...: PandasAPIOnSparkAdviceWarning: The config 'spark.sql.ansi.enabled' is
set to True. This can cause unexpected behavior from pandas API on Spark since
pandas API on Spark follows the behavior of pandas, not SQL.
warnings.warn(message, PandasAPIOnSparkAdviceWarning)
id
0 0
>>> spark.conf.set('spark.sql.ansi.enabled', False)
>>> ps.range(1)
id
0 0
```
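As a supplementary sketch (not part of the change itself), the opt-in could also be scoped to a single block with `ps.option_context`, so that `compute.fail_on_ansi_mode` reverts to its default afterwards; a running `spark` session with ANSI mode enabled is assumed:
```py
import pyspark.pandas as ps

# Temporarily allow pandas API on Spark to run while ANSI mode is enabled;
# the option reverts to its previous value when the block exits.
with ps.option_context('compute.fail_on_ansi_mode', False):
    psdf = ps.range(3)  # emits PandasAPIOnSparkAdviceWarning instead of failing
    print(psdf.to_pandas())

# Outside the block the default applies again, so ps.range(1) would raise
# UnsupportedOperationException while 'spark.sql.ansi.enabled' is 'true'.
```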
### Why are the changes needed?
From Spark 4.0, ANSI mode is enabled by default, but pandas API on Spark
does not work properly under it.
To avoid silently producing incorrect results, block it by default and
require users to opt in explicitly.
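For reference, a minimal sketch (not part of this patch) of the other path suggested by the error message, turning ANSI mode off for the whole session via the standard SparkSession builder:
```py
from pyspark.sql import SparkSession

# Disable ANSI mode for the session so pandas API on Spark keeps
# following pandas semantics without setting 'compute.fail_on_ansi_mode'.
spark = (
    SparkSession.builder
    .config("spark.sql.ansi.enabled", "false")
    .getOrCreate()
)

import pyspark.pandas as ps
print(ps.range(1))  # runs without the block or the advice warning
```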
### Does this PR introduce _any_ user-facing change?
Yes, pandas API on Spark will fail on ANSI mode by default.
### How was this patch tested?
Manually.
### Was this patch authored or co-authored using generative AI tooling?
No.