[
https://issues.apache.org/jira/browse/SPARK-22966?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Hyukjin Kwon updated SPARK-22966:
---------------------------------
Labels: bulk-closed (was: )
> Spark SQL should handle Python UDFs that return a datetime.date or
> datetime.datetime
> ------------------------------------------------------------------------------------
>
> Key: SPARK-22966
> URL: https://issues.apache.org/jira/browse/SPARK-22966
> Project: Spark
> Issue Type: Bug
> Components: PySpark
> Affects Versions: 2.2.0, 2.2.1
> Reporter: Kris Mok
> Priority: Major
> Labels: bulk-closed
>
> Currently, in Spark SQL, if a Python UDF returns a {{datetime.date}} (which
> should correspond to the Spark SQL {{date}} type) or a {{datetime.datetime}}
> (which should correspond to the Spark SQL {{timestamp}} type), the value is
> unpickled on the JVM side into a {{java.util.Calendar}}, which Spark SQL's
> internal row format does not understand, so such UDFs produce incorrect
> results.
> e.g.
> {code:none}
> >>> import datetime
> >>> from pyspark.sql.functions import udf, lit
> >>> py_date = udf(datetime.date)
> >>> spark.range(1).select(py_date(lit(2017), lit(10), lit(30)) ==
> ...                       lit(datetime.date(2017, 10, 30))).show()
> +----------------------------------------+
> |(date(2017, 10, 30) = DATE '2017-10-30')|
> +----------------------------------------+
> | false|
> +----------------------------------------+
> {code}
> (Changing the definition of {{py_date}} from {{udf(datetime.date)}} to
> {{udf(datetime.date, 'date')}} doesn't work either.)
> We should correctly handle Python UDFs that return objects of such types.
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)