MaxGekk commented on issue #25678: [SPARK-28973][SQL] Add `TimeType` and 
support `java.time.LocalTime` as its external type.
URL: https://github.com/apache/spark/pull/25678#issuecomment-530013450
 
 
   > how is TimeType mapped to Parquet TIME here?
   
   I think in the same way as `TIMESTAMP`. We should copy the micros from a row
   and store them as `INT64`, as is done here:
   https://github.com/apache/spark/blob/26998b86c13e79582a3df31f6184f825cde45e73/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetWriteSupport.scala#L180-L182
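
   To make the stored value concrete, here is a minimal, runnable sketch (not the
   PR's code) of the conversion between the external `java.time.LocalTime` and the
   micros-of-day `Long` that would be written to the `INT64` column; the object and
   method names are made up for illustration:

   ```scala
   import java.time.LocalTime

   object TimeMicrosSketch {
     // Assumed internal representation: microseconds since midnight.
     def localTimeToMicros(t: LocalTime): Long = t.toNanoOfDay / 1000L

     def microsToLocalTime(micros: Long): LocalTime =
       LocalTime.ofNanoOfDay(Math.multiplyExact(micros, 1000L))

     def main(args: Array[String]): Unit = {
       val t = LocalTime.of(23, 59, 59, 123456000)
       val micros = localTimeToMicros(t) // 86399123456
       assert(microsToLocalTime(micros) == t)
       println(s"$t -> $micros micros of day")
     }
   }
   ```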
   
   The physical Parquet type will be `INT64`, and the logical type will be
   `TIME_MICROS`, similar to:
   https://github.com/apache/spark/blob/be2238fb502b0f49a8a1baa6da9bc3e99540b40e/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetSchemaConverter.scala#L390-L391
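
   Roughly, the branch in `ParquetSchemaConverter.convertField` could look like the
   fragment below. This is only a sketch mirroring the `TIMESTAMP_MICROS` case
   linked above, not the actual patch; `TimeType` is the type proposed in this PR:

   ```scala
   import org.apache.parquet.schema.OriginalType.TIME_MICROS
   import org.apache.parquet.schema.PrimitiveType.PrimitiveTypeName.INT64
   import org.apache.parquet.schema.Types

   // Sketch of a convertField branch: micros-of-day stored as INT64,
   // annotated with the TIME_MICROS logical (original) type.
   case TimeType =>
     Types.primitive(INT64, repetition).as(TIME_MICROS).named(field.name)
   ```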
   
   > Let's say I wrote to some other system that didn't have such a type, like
   an RDBMS. It would end up as a long? Wouldn't cause any particular problem, right?
   
   `TIME` is a standard SQL type. If the RDBMS does not support it (which should
   be a rare case), we will store our `TIME` value as a `long`.
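
   For example, one way to get that `long` fallback in the JDBC writer would be a
   dialect that maps the proposed `TimeType` to a 64-bit integer column. This is a
   hypothetical sketch, not part of this PR; `NoTimeDialect` and the JDBC URL
   prefix are made up, and `TimeType` is the type this PR introduces:

   ```scala
   import java.sql.Types
   import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcType}
   import org.apache.spark.sql.types.{DataType, TimeType} // TimeType: proposed in this PR

   // Hypothetical dialect for an RDBMS without a native TIME type:
   // store the micros-of-day value in a BIGINT column.
   object NoTimeDialect extends JdbcDialect {
     override def canHandle(url: String): Boolean =
       url.startsWith("jdbc:timelessdb")

     override def getJDBCType(dt: DataType): Option[JdbcType] = dt match {
       case TimeType => Some(JdbcType("BIGINT", Types.BIGINT))
       case _        => None
     }
   }
   ```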
