BryanCutler commented on a change in pull request #26045: [SPARK-29367][DOC] Add compatibility note for Arrow 0.15.0 to SQL guide
URL: https://github.com/apache/spark/pull/26045#discussion_r332790684
##########
File path: docs/sql-pyspark-pandas-with-arrow.md
##########

@@ -219,3 +219,14 @@ Note that a standard UDF (non-Pandas) will load timestamp data as Python datetime objects, which is
 different than a Pandas timestamp. It is recommended to use Pandas time series functionality when
 working with timestamps in `pandas_udf`s to get the best performance, see
 [here](https://pandas.pydata.org/pandas-docs/stable/timeseries.html) for details.
+
+### Compatibility Setting for PyArrow >= 0.15.0 and Spark 2.3.x, 2.4.x
+
+Since Arrow 0.15.0, a change in the binary IPC format requires an environment variable to be set in
+Spark so that PySpark maintains compatibility with PyArrow 0.15.0 and above. The following can be
+added to `conf/spark-env.sh` to use the legacy IPC format:
+
+```
+ARROW_PRE_0_15_IPC_FORMAT=1
+```
+
+This will instruct PyArrow >= 0.15.0 to use the legacy IPC format with the older Arrow Java that is in Spark 2.3.x and 2.4.x.

Review comment:

> Hm, @BryanCutler, do you plan to upgrade and also raise the minimum PyArrow version in SPARK-29376 (we upgrade the JVM side too, so we wouldn't need to set the environment variable in Spark 3.0)?

So once we upgrade Arrow Java to 0.15.0, setting the env var is no longer necessary, and Spark will also work with older versions of PyArrow. Because of this, I don't think it's necessary to increase the minimum version right now. I do think we will have Arrow 1.0 before Spark 3.0, so it would make sense to set that as the minimum version.
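As a side note for readers of this thread (not part of the PR's diff): on deployments where editing `conf/spark-env.sh` is not practical, the same environment variable can also be propagated through Spark's standard `spark.executorEnv.*` configuration. The sketch below is a hedged illustration of that pattern; the app name is hypothetical, and it assumes a Spark 2.3.x/2.4.x cluster running PyArrow >= 0.15.0.

```python
import os

from pyspark.sql import SparkSession

# Driver-side PyArrow picks up the variable from the driver's process
# environment, so set it before any Arrow serialization happens.
# (Illustrative sketch, not something prescribed by the PR itself.)
os.environ["ARROW_PRE_0_15_IPC_FORMAT"] = "1"

spark = (
    SparkSession.builder
    .appName("arrow-legacy-ipc-demo")  # hypothetical app name
    # spark.executorEnv.* forwards the variable to executor Python workers,
    # so pandas_udf execution also falls back to the legacy Arrow IPC format.
    .config("spark.executorEnv.ARROW_PRE_0_15_IPC_FORMAT", "1")
    .getOrCreate()
)
```

Setting the variable in `conf/spark-env.sh`, as the doc change above recommends, remains the simplest option cluster-wide, since it avoids per-application configuration.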
