Github user adrian-wang commented on a diff in the pull request:

    https://github.com/apache/spark/pull/6759#discussion_r32197105
  
    --- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/parquet/ParquetConverter.scala ---
    @@ -498,69 +493,21 @@ private[parquet] object CatalystArrayConverter {
     }
     
     private[parquet] object CatalystTimestampConverter {
    -  // TODO most part of this comes from Hive-0.14
    -  // Hive code might have some issues, so we need to keep an eye on it.
    -  // Also we use NanoTime and Int96Values from parquet-examples.
    -  // We utilize jodd to convert between NanoTime and Timestamp
    -  val parquetTsCalendar = new ThreadLocal[Calendar]
    -  def getCalendar: Calendar = {
    -    // this is a cache for the calendar instance.
    -    if (parquetTsCalendar.get == null) {
    -      
parquetTsCalendar.set(Calendar.getInstance(TimeZone.getTimeZone("GMT")))
    -    }
    -    parquetTsCalendar.get
    -  }
    -  val NANOS_PER_SECOND: Long = 1000000000
    -  val SECONDS_PER_MINUTE: Long = 60
    -  val MINUTES_PER_HOUR: Long = 60
    -  val NANOS_PER_MILLI: Long = 1000000
    +  // see 
http://stackoverflow.com/questions/466321/convert-unix-timestamp-to-julian
    +  val JULIAN_DAY_OF_EPOCH = 2440587.5
    --- End diff --
    
    The old code is simply rewritten from the Hive 0.14 source code. To verify 
compatibility, we need to use Hive to write some data in Parquet and read it 
with Spark SQL, and also write some data in Parquet with Spark and read it 
from Hive. Once we are compatible with Hive, we are also compatible with Impala.
    
    cc @yhuai @felixcheung 
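
    For context on the `JULIAN_DAY_OF_EPOCH` constant in the diff, here is a 
minimal sketch of the Unix-epoch/Julian-day relationship it encodes (the Unix 
epoch, 1970-01-01T00:00:00Z, falls at Julian day 2440587.5). The helper names 
below are hypothetical, for illustration only; they are not the methods in the 
actual patch.

    ```scala
    object JulianDaySketch {
      // Julian day number of the Unix epoch, per the Stack Overflow link above.
      val JULIAN_DAY_OF_EPOCH = 2440587.5
      val MILLIS_PER_DAY = 86400000L

      // Convert milliseconds since the Unix epoch to a fractional Julian day.
      def millisToJulianDay(millis: Long): Double =
        JULIAN_DAY_OF_EPOCH + millis.toDouble / MILLIS_PER_DAY

      // Convert a fractional Julian day back to milliseconds since the Unix epoch.
      def julianDayToMillis(jd: Double): Long =
        ((jd - JULIAN_DAY_OF_EPOCH) * MILLIS_PER_DAY).toLong
    }
    ```

    Round-tripping through these two functions is what a Hive/Spark 
compatibility test would exercise, since Parquet INT96 timestamps store a 
Julian day plus nanoseconds within the day.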

