Github user javierluraschi commented on the issue:
https://github.com/apache/spark/pull/22913
Yes, I believe that's the case. This change
https://github.com/apache/arrow/pull/2887 maps the R bindings from `POSIXct` to
`timestamp` instead of `date`, so it looks like there is no need
Github user javierluraschi closed the pull request at:
https://github.com/apache/spark/pull/22913
Github user javierluraschi commented on a diff in the pull request:
https://github.com/apache/spark/pull/22913#discussion_r230833894
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/arrow/ArrowUtils.scala ---
@@ -71,6 +71,7 @@ object ArrowUtils {
case d
Github user javierluraschi commented on a diff in the pull request:
https://github.com/apache/spark/pull/22913#discussion_r230607581
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/arrow/ArrowUtils.scala ---
@@ -71,6 +71,7 @@ object ArrowUtils {
case d
GitHub user javierluraschi opened a pull request:
https://github.com/apache/spark/pull/22913
[SPARK-25902][SQL] Add support for dates with milliseconds in Apache Arrow
bindings
Currently, the Apache Arrow bindings for Java only support `Date` with the
metric set to `DateUnit.DAY`
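To make the distinction concrete, here is a minimal JDK-only sketch of the two encodings that Arrow's `DateUnit` names: `DAY` stores a date as days since the Unix epoch, while `MILLISECOND` stores it as milliseconds since the epoch (midnight UTC). This is an illustration of the semantics, not code from the PR; the class and method names are hypothetical, and Arrow's actual types live in `org.apache.arrow.vector.types.DateUnit` and `org.apache.arrow.vector.types.pojo.ArrowType.Date`.

```java
import java.time.LocalDate;

public class DateUnitDemo {
    static final long MILLIS_PER_DAY = 24L * 60 * 60 * 1000;

    // Convert a DAY-encoded date (days since 1970-01-01) to its
    // MILLISECOND encoding (millis since the epoch, at midnight UTC).
    static long daysToMillis(long epochDays) {
        return epochDays * MILLIS_PER_DAY;
    }

    public static void main(String[] args) {
        // 2018-11-01 is epoch day 17836.
        long days = LocalDate.of(2018, 11, 1).toEpochDay();
        System.out.println(days);               // prints 17836
        System.out.println(daysToMillis(days)); // prints 1541030400000
    }
}
```

A binding that only understands `DateUnit.DAY` would reject or mis-read a column carrying the millisecond representation, which is the gap this PR set out to close.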