Github user davies commented on a diff in the pull request:
https://github.com/apache/spark/pull/6759#discussion_r32387873
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/parquet/ParquetConverter.scala ---
@@ -498,69 +493,21 @@ private[parquet] object CatalystArrayConverter {
}
private[parquet] object CatalystTimestampConverter {
- // TODO most part of this comes from Hive-0.14
- // Hive code might have some issues, so we need to keep an eye on it.
- // Also we use NanoTime and Int96Values from parquet-examples.
- // We utilize jodd to convert between NanoTime and Timestamp
- val parquetTsCalendar = new ThreadLocal[Calendar]
- def getCalendar: Calendar = {
- // this is a cache for the calendar instance.
- if (parquetTsCalendar.get == null) {
-       parquetTsCalendar.set(Calendar.getInstance(TimeZone.getTimeZone("GMT")))
- }
- parquetTsCalendar.get
- }
- val NANOS_PER_SECOND: Long = 1000000000
- val SECONDS_PER_MINUTE: Long = 60
- val MINUTES_PER_HOUR: Long = 60
- val NANOS_PER_MILLI: Long = 1000000
+  // see http://stackoverflow.com/questions/466321/convert-unix-timestamp-to-julian
+ val JULIAN_DAY_OF_EPOCH = 2440587.5
--- End diff --
I verified this using the sample Parquet file in SPARK-4768; it reads back
exactly the same value (modulo the timezone difference).
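For context, the constant above maps Unix time onto Julian days. A minimal
Python sketch of that mapping (illustrative only, not Spark's actual code;
function names here are hypothetical), using the INT96-style pair of a
Julian Day Number plus nanoseconds since midnight:

```python
# Illustrative sketch of the Julian-day <-> Unix-time mapping behind
# Parquet INT96 timestamps. Names are hypothetical, not Spark's API.

NANOS_PER_SECOND = 1_000_000_000
SECONDS_PER_DAY = 86_400
# 2440587.5 is the fractional Julian date of 1970-01-01T00:00:00Z (midnight);
# the integer Julian Day Number of the Unix epoch's calendar day is 2440588.
JULIAN_DAY_OF_EPOCH = 2440588

def julian_to_unix_nanos(julian_day: int, nanos_of_day: int) -> int:
    """(julianDay, nanos since midnight) -> nanoseconds since the Unix epoch."""
    days = julian_day - JULIAN_DAY_OF_EPOCH
    return days * SECONDS_PER_DAY * NANOS_PER_SECOND + nanos_of_day

def unix_nanos_to_julian(unix_nanos: int) -> tuple[int, int]:
    """Inverse mapping; divmod keeps nanos_of_day within [0, 86400e9)."""
    days, nanos = divmod(unix_nanos, SECONDS_PER_DAY * NANOS_PER_SECOND)
    return days + JULIAN_DAY_OF_EPOCH, nanos

# Round trip: the same value comes back exactly.
assert unix_nanos_to_julian(julian_to_unix_nanos(2440588, 123)) == (2440588, 123)
```

The integer arithmetic makes the round trip lossless, which matches the
exact-value read-back observed above.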