HyukjinKwon commented on a change in pull request #23657:
[SPARK-26566][PYTHON][SQL] Upgrade Apache Arrow to version 0.12.0
URL: https://github.com/apache/spark/pull/23657#discussion_r251230103
##########
File path: python/pyspark/sql/types.py
##########
@@ -1688,7 +1688,10 @@ def _check_series_convert_date(series, data_type):
     :param series: pandas.Series
     :param data_type: a Spark data type for the series
     """
-    if type(data_type) == DateType:
+    import pyarrow
+    from distutils.version import LooseVersion
+    # As of Arrow 0.12.0, date_as_object is True by default, see ARROW-3910
+    if LooseVersion(pyarrow.__version__) < LooseVersion("0.12.0") and type(data_type) == DateType:
Review comment:
BTW, I was thinking about targeting Spark 3.0.0 to upgrade the minimum PyArrow
version, since the code is getting complicated with all these if-else version checks.
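
For anyone following along, a minimal sketch of the kind of version-gated
branch being discussed (the helper name _needs_date_cast is made up for
illustration and is not part of this PR):

    from distutils.version import LooseVersion

    import pyarrow

    from pyspark.sql.types import DateType


    def _needs_date_cast(data_type):
        # Hypothetical helper, not in the PR: before Arrow 0.12.0, to_pandas()
        # returned dates as datetime64[ns], so Spark had to cast the series to
        # datetime.date itself. From 0.12.0 on, date_as_object defaults to True
        # (ARROW-3910) and the cast is unnecessary, hence the version guard.
        return (LooseVersion(pyarrow.__version__) < LooseVersion("0.12.0")
                and type(data_type) == DateType)

Raising the minimum supported PyArrow version would let guards like this be
deleted outright instead of carrying both branches.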