[
https://issues.apache.org/jira/browse/SPARK-13341?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sean Owen resolved SPARK-13341.
-------------------------------
Resolution: Duplicate
> Casting Unix timestamp to SQL timestamp fails
> ---------------------------------------------
>
> Key: SPARK-13341
> URL: https://issues.apache.org/jira/browse/SPARK-13341
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.6.0
> Reporter: William Dee
>
> The way unix timestamp casting is handled changed between Spark 1.5.2 and
> Spark 1.6.0, breaking existing jobs. This can easily be demonstrated in the spark-shell:
> {code:title=1.5.2}
> scala> sqlContext.sql("SELECT CAST(1455580840000 AS TIMESTAMP) as ts, CAST(CAST(1455580840000 AS TIMESTAMP) AS DATE) as d").show
> +--------------------+----------+
> | ts| d|
> +--------------------+----------+
> |2016-02-16 00:00:...|2016-02-16|
> +--------------------+----------+
> {code}
> {code:title=1.6.0}
> scala> sqlContext.sql("SELECT CAST(1455580840000 AS TIMESTAMP) as ts, CAST(CAST(1455580840000 AS TIMESTAMP) AS DATE) as d").show
> +--------------------+----------+
> | ts| d|
> +--------------------+----------+
> |48095-07-09 12:06...|095-07-09|
> +--------------------+----------+
> {code}
> I'm not sure what exactly is causing this, but the defect was definitely
> introduced in Spark 1.6.0: jobs that relied on this functionality ran on
> 1.5.2 and no longer run on 1.6.0.
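The two outputs are consistent with the literal being read as milliseconds since the epoch in 1.5.2 and as seconds since the epoch in 1.6.0: a quick sanity check outside Spark, using plain `java.time` (no Spark APIs; `EpochCheck` is just an illustrative name), reproduces both dates from the literal in the report:

```java
import java.time.Instant;
import java.time.LocalDate;
import java.time.ZoneOffset;

public class EpochCheck {
    public static void main(String[] args) {
        // The literal from the report above.
        long raw = 1455580840000L;

        // Read as milliseconds since the epoch (what the jobs expect):
        LocalDate asMillis = Instant.ofEpochMilli(raw).atZone(ZoneOffset.UTC).toLocalDate();

        // Read as seconds since the epoch (the interpretation behind the 1.6.0 output):
        LocalDate asSeconds = Instant.ofEpochSecond(raw).atZone(ZoneOffset.UTC).toLocalDate();

        System.out.println(asMillis);            // 2016-02-16
        System.out.println(asSeconds.getYear()); // 48095
    }
}
```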
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)