[ https://issues.apache.org/jira/browse/SPARK-8420?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14594215#comment-14594215 ]

Yin Huai commented on SPARK-8420:
---------------------------------

Will resolve this one after the 1.4 backport is merged.

> Inconsistent behavior with Dataframe Timestamp between 1.3.1 and 1.4.0
> ----------------------------------------------------------------------
>
>                 Key: SPARK-8420
>                 URL: https://issues.apache.org/jira/browse/SPARK-8420
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.4.0
>            Reporter: Justin Yip
>            Assignee: Michael Armbrust
>            Priority: Blocker
>              Labels: releasenotes
>             Fix For: 1.5.0
>
>
> I am trying out 1.4.0 and noticed some differences in Timestamp behavior 
> between 1.3.1 and 1.4.0.
> In 1.3.1, I can compare a Timestamp with a string.
> {code}
> scala> val df = sqlContext.createDataFrame(Seq(
>   (1, Timestamp.valueOf("2015-01-01 00:00:00")),
>   (2, Timestamp.valueOf("2014-01-01 00:00:00"))))
> ...
> scala> df.filter($"_2" <= "2014-06-01").show
> ...
> _1 _2                  
> 2  2014-01-01 00:00:...
> {code}
> However, in 1.4.0, the filter is always false:
> {code}
> scala> val df = sqlContext.createDataFrame(Seq(
>   (1, Timestamp.valueOf("2015-01-01 00:00:00")),
>   (2, Timestamp.valueOf("2014-01-01 00:00:00"))))
> df: org.apache.spark.sql.DataFrame = [_1: int, _2: timestamp]
> scala> df.filter($"_2" <= "2014-06-01").show
> +--+--+
> |_1|_2|
> +--+--+
> +--+--+
> {code}
> I am not sure whether this is intended, but I cannot find any documentation 
> mentioning this inconsistency.
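> As a possible workaround (just a sketch on my side, not something the ticket 
> confirms), comparing against an actual java.sql.Timestamp value instead of a 
> string avoids the string-vs-timestamp comparison entirely:
> {code}
> scala> // Sketch: pass a Timestamp value so the comparison stays timestamp-to-timestamp;
> scala> // the cutoff 2014-06-01 00:00:00 matches the example above.
> scala> df.filter($"_2" <= Timestamp.valueOf("2014-06-01 00:00:00")).show
> ...
> {code}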


