Keith Bourgoin created SPARK-20787:
-----------------------------------
Summary: PySpark can't handle datetimes before 1900
Key: SPARK-20787
URL: https://issues.apache.org/jira/browse/SPARK-20787
Project: Spark
Issue Type:
[ https://issues.apache.org/jira/browse/SPARK-16394?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15907383#comment-15907383 ]
Keith Bourgoin commented on SPARK-16394:
PS: The above is from Spark 2.1.0 running on an Ubuntu
[ https://issues.apache.org/jira/browse/SPARK-16394?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15906266#comment-15906266 ]
Keith Bourgoin commented on SPARK-16394:
We've been having the same issue. To illustrate, this is
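The illustration is cut off above, but the failure class is well known: conversion paths built on time.mktime or strftime("%s") can break for datetimes before 1900 on some platforms. As a hypothetical sketch of the workaround idea (the function name to_epoch_us is mine, not Spark's), pure timedelta arithmetic converts any datetime, pre-1900 included:

```python
from datetime import datetime

def to_epoch_us(dt):
    # Convert a naive datetime (treated as UTC) to microseconds since the
    # Unix epoch using timedelta arithmetic only. Unlike conversions built
    # on time.mktime or strftime("%s"), this works for dates before 1900.
    delta = dt - datetime(1970, 1, 1)
    return (delta.days * 86400 + delta.seconds) * 10**6 + delta.microseconds

# A pre-1900 date converts without error (to a negative offset):
us = to_epoch_us(datetime(1886, 3, 1))
```

This is a sketch of the conversion technique, not PySpark's actual internal code path.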
[ https://issues.apache.org/jira/browse/SPARK-19439?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15854330#comment-15854330 ]
Keith Bourgoin commented on SPARK-19439:
SPARK-10915 refers to making it possible to write UDAFs
[ https://issues.apache.org/jira/browse/SPARK-19439?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Keith Bourgoin updated SPARK-19439:
-----------------------------------
Description:
When trying to import a Scala UDAF using registerJavaFunction, I get this
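registerJavaFunction wires up scalar UDFs only; a UDAF additionally needs the partial-aggregation contract: fold each row into a buffer, merge buffers across partitions, then evaluate. As a plain-Python illustration of that contract (method names mirror Spark's UserDefinedAggregateFunction interface, but this class is hypothetical, not Spark API):

```python
class MeanUDAF:
    """Hypothetical sketch of the aggregation contract a Spark UDAF
    implements: per-row update, cross-partition merge, final evaluate."""

    def __init__(self):
        # Aggregation buffer: running count and sum.
        self.count = 0
        self.total = 0.0

    def update(self, value):
        # Fold one input row into this partition's buffer.
        self.count += 1
        self.total += value

    def merge(self, other):
        # Combine another partition's buffer into this one.
        self.count += other.count
        self.total += other.total

    def evaluate(self):
        # Produce the final aggregate value.
        return self.total / self.count if self.count else None

# Two "partitions" aggregated independently, then merged:
left, right = MeanUDAF(), MeanUDAF()
for v in (1.0, 2.0):
    left.update(v)
for v in (3.0, 4.0):
    right.update(v)
left.merge(right)
```

The merge step is what a scalar-UDF registration hook has no slot for, which is why registerJavaFunction cannot express a UDAF as-is.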
Keith Bourgoin created SPARK-19439:
-----------------------------------
Summary: PySpark's registerJavaFunction Should Support UDAFs
Key: SPARK-19439
URL: https://issues.apache.org/jira/browse/SPARK-19439
Project: Spark