[ 
https://issues.apache.org/jira/browse/SPARK-31790?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

philipse updated SPARK-31790:
-----------------------------
    Description: 
`CAST(n AS TIMESTAMP)`: if n is of Byte/Short/Int/Long type, Hive treats n as a count of milliseconds since the Unix epoch, while Spark SQL treats it as seconds, so the same cast produces different results. Please be careful when using it.

For example:
{code:java}
In Spark
spark-sql> select cast(1586318188000 as timestamp);
52238-06-04 13:06:400.0
spark-sql> select cast(1586318188 as timestamp);
2020-04-08 11:56:28

In Hive
hive> select cast(1586318188000 as timestamp);
2020-04-08 11:56:28

hive> select cast(1586318188 as timestamp);
1970-01-19 16:38:38.188{code}
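The two conventions can be reproduced outside of SQL. The sketch below (my illustration, not part of the original report) interprets the same integer as seconds and as milliseconds since the Unix epoch using exact `timedelta` arithmetic; the wall-clock times in the report above appear to be 8 hours ahead of these UTC values, consistent with a UTC+8 session timezone.

```python
from datetime import datetime, timedelta, timezone

n = 1586318188  # the integer cast in the examples above
epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)

# Spark SQL convention: the integer counts seconds since the Unix epoch.
print(epoch + timedelta(seconds=n))       # 2020-04-08 03:56:28+00:00

# Hive convention: the same integer counts milliseconds since the epoch.
print(epoch + timedelta(milliseconds=n))  # 1970-01-19 08:38:38.188000+00:00
```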
 

  was:`CAST(n AS TIMESTAMP)`: if n is of Byte/Short/Int/Long type, Hive treats n as a count of milliseconds since the Unix epoch, while Spark SQL treats it as seconds, so the same cast produces different results. Please be careful when using it.


> cast scenarios may generate different results between Hive and Spark
> ---------------------------------------------------------------------
>
>                 Key: SPARK-31790
>                 URL: https://issues.apache.org/jira/browse/SPARK-31790
>             Project: Spark
>          Issue Type: Documentation
>          Components: Documentation
>    Affects Versions: 2.4.5
>            Reporter: philipse
>            Priority: Minor
>
> `CAST(n AS TIMESTAMP)`: if n is of Byte/Short/Int/Long type, Hive treats n 
> as a count of milliseconds since the Unix epoch, while Spark SQL treats it 
> as seconds, so the same cast produces different results. Please be careful 
> when using it.
> For example:
> {code:java}
> In Spark
> spark-sql> select cast(1586318188000 as timestamp);
> 52238-06-04 13:06:400.0
> spark-sql> select cast(1586318188 as timestamp);
> 2020-04-08 11:56:28
> In Hive
> hive> select cast(1586318188000 as timestamp);
> 2020-04-08 11:56:28
> hive> select cast(1586318188 as timestamp);
> 1970-01-19 16:38:38.188{code}
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
