[ https://issues.apache.org/jira/browse/SPARK-28955?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17190322#comment-17190322 ]

Bill Schneider edited comment on SPARK-28955 at 9/3/20, 5:25 PM:
-----------------------------------------------------------------

Changing to a new feature; this is a recurring issue when different Spark jobs 
run in different time zones and we want the time to remain fixed regardless of 
time zone (i.e., local-time semantics).

In other words: I would like to be able to parse the same CSV (or query the 
same TIMESTAMP columns over JDBC) and get consistent results regardless of the 
time zone Spark is running in.  Today the generated Parquet differs depending 
on the Spark job's time zone.
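
To make the problem concrete, here is a minimal sketch (Scala, assuming a local 
SparkSession; the column name and sample value are made up) showing that the 
same wall-clock string parses to a different underlying instant depending on 
spark.sql.session.timeZone:

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("session-tz-demo")
      .getOrCreate()
    import spark.implicits._

    // The same wall-clock string, parsed under two different session time
    // zones. Spark's TimestampType is instant-based (microseconds since the
    // epoch), so the stored integer value shifts with the zone.
    for (zone <- Seq("UTC", "America/New_York")) {
      spark.conf.set("spark.sql.session.timeZone", zone)
      Seq("2019-01-01 17:00").toDF("ts_str")
        .selectExpr("cast(ts_str as timestamp) AS ts",
                    "unix_timestamp(cast(ts_str as timestamp)) AS epoch_seconds")
        .show(false)
    }

Both runs display the same wall-clock time, but epoch_seconds differs, which is 
why the Parquet written by each job differs.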



> Support for LocalDateTime semantics
> -----------------------------------
>
>                 Key: SPARK-28955
>                 URL: https://issues.apache.org/jira/browse/SPARK-28955
>             Project: Spark
>          Issue Type: New Feature
>          Components: SQL
>    Affects Versions: 2.3.0
>            Reporter: Bill Schneider
>            Priority: Major
>
> It would be great if Spark supported local times in DataFrames, rather than 
> only instants. 
> The specific use case I have in mind is something like
>  * parse "2019-01-01 17:00" (no time zone) from CSV -> LocalDateTime in a 
> DataFrame
>  * save to Parquet: LocalDateTime is stored with the same integer value as 
> 2019-01-01 17:00 UTC, but with isAdjustedToUTC=false.  (Currently Spark saves 
> either INT96 or TIMESTAMP_MILLIS/TIMESTAMP_MICROS, both of which have 
> isAdjustedToUTC=true)
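
Until something like this lands, a common workaround (a sketch only, not the 
requested feature; the file paths and column name below are hypothetical) is to 
pin the session time zone to a fixed value, e.g. UTC, in every job that reads 
or writes the data, so the stored integer is stable across jobs:

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("pin-tz-workaround")
      .getOrCreate()

    // Pin the session time zone before parsing and writing, so
    // "2019-01-01 17:00" always maps to the same stored integer no matter
    // which zone the job runs in.
    spark.conf.set("spark.sql.session.timeZone", "UTC")

    val events = spark.read
      .option("header", "true")
      .option("timestampFormat", "yyyy-MM-dd HH:mm")
      .schema("event_time TIMESTAMP")  // explicit schema; column name is made up
      .csv("/data/events.csv")         // hypothetical input path

    // The written file is still tagged isAdjustedToUTC=true in the Parquet
    // metadata; only a true local-date-time type would change that.
    events.write.mode("overwrite").parquet("/data/events_parquet")  // hypothetical output

This keeps the values consistent but does not fix the metadata, which is why a 
real LocalDateTime type is the better answer.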


