Github user javierluraschi commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22913#discussion_r230833894
  
    --- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/arrow/ArrowUtils.scala 
---
    @@ -71,6 +71,7 @@ object ArrowUtils {
         case d: ArrowType.Decimal => DecimalType(d.getPrecision, d.getScale)
         case date: ArrowType.Date if date.getUnit == DateUnit.DAY => DateType
         case ts: ArrowType.Timestamp if ts.getUnit == TimeUnit.MICROSECOND => 
TimestampType
    +    case date: ArrowType.Date if date.getUnit == DateUnit.MILLISECOND => 
TimestampType
    --- End diff --
    
    Right... let me keep just the original PR commit for now. So yes, we could 
map to date, but we would lose the time component, so the best mapping we have 
is to `timestamp`.
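
    For illustration only (not part of the PR): a minimal Java sketch, using a 
hypothetical sample millisecond value and a made-up class name, of why mapping 
a millisecond-resolution Arrow `Date` to a day-resolution date type loses the 
time of day, while a timestamp type preserves the full instant.

```java
import java.time.Instant;
import java.time.LocalDate;

public class ArrowDateMappingSketch {
    public static void main(String[] args) {
        // A hypothetical Arrow Date(MILLISECOND) value: milliseconds since
        // the Unix epoch, here 2018-11-05 13:45:30.250 UTC.
        long millis = 1541425530250L;

        // Mapping to a date type keeps only whole days since the epoch,
        // so the time of day (13:45:30.250) is discarded.
        LocalDate asDate = LocalDate.ofEpochDay(millis / 86_400_000L);

        // Mapping to a timestamp type keeps the full instant.
        Instant asTimestamp = Instant.ofEpochMilli(millis);

        System.out.println(asDate);       // 2018-11-05
        System.out.println(asTimestamp);  // 2018-11-05T13:45:30.250Z
    }
}
```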
    
    Let me check back in the `arrow` project as to why `POSIXct` is being 
mapped to `Date` rather than `Timestamp` (see 
[arrow/r/src/array.cpp#L461](https://github.com/apache/arrow/blob/3a1dd3feb9ca09f92168f46fcfc06c01305df3ec/r/src/array.cpp#L461));
 if we can change that, then I would agree we don't need this change.


---
