Github user ueshin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22913#discussion_r230628333
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/arrow/ArrowUtils.scala ---
    @@ -71,6 +71,7 @@ object ArrowUtils {
         case d: ArrowType.Decimal => DecimalType(d.getPrecision, d.getScale)
         case date: ArrowType.Date if date.getUnit == DateUnit.DAY => DateType
         case ts: ArrowType.Timestamp if ts.getUnit == TimeUnit.MICROSECOND => TimestampType
    +    case date: ArrowType.Date if date.getUnit == DateUnit.MILLISECOND => TimestampType
    --- End diff ---
    
    Note that Spark doesn't have a date type with millisecond precision, so if we want to map this to DateType, the hours, minutes, ... will be lost. Otherwise we have to map it to TimestampType.
    Which is the proper behavior? cc @BryanCutler
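
    For reference, a minimal sketch of the two alternatives in the `fromArrowType` match (the DateType arm is a hypothetical alternative, not part of this PR; imports assume the Arrow Java and Spark SQL artifacts are on the classpath):

        import org.apache.arrow.vector.types.DateUnit
        import org.apache.arrow.vector.types.pojo.ArrowType
        import org.apache.spark.sql.types.{DataType, DateType, TimestampType}

        def fromArrowType(dt: ArrowType): DataType = dt match {
          // Option A (this PR): map Date(MILLISECOND) to TimestampType so the
          // sub-day component (hours, minutes, ...) is preserved.
          case date: ArrowType.Date if date.getUnit == DateUnit.MILLISECOND => TimestampType
          // Option B (hypothetical alternative): map to DateType instead,
          // silently truncating the values to whole days.
          // case date: ArrowType.Date if date.getUnit == DateUnit.MILLISECOND => DateType
          case other => throw new UnsupportedOperationException(s"Unsupported data type: $other")
        }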

