Github user byF commented on the pull request:

    https://github.com/apache/spark/pull/2084#issuecomment-53377486
  
    I'll add the test case once I get to my office today.
    
    Exactly, the parser supported only `CAST(... AS STRING)`. However, it would 
be nice if I could write:
    
    ```
    WHERE timestamp >= '2012-07-16 00:00:00'
                 AND timestamp <= '2012-07-16 01:00:00'
    ```
    
    instead of
    
    ```
    WHERE timestamp >= CAST('2012-07-16 00:00:00' AS TIMESTAMP)
                 AND timestamp <= CAST('2012-07-16 01:00:00' AS TIMESTAMP)
    ```
    
    Parsing this kind of expression happens in 
`SqlParser.comparisonExpression`; I am considering modifying `predicates.scala` 
so that it effectively says: 
    
    > If a queried expression has TimestampType, wrap the value 
expression in a `Cast` to TimestampType. 
    
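    Just to illustrate the idea, here is a rough sketch, written as a small 
analyzer-style rule rather than as the actual change to `predicates.scala`; 
the rule name `CastStringToTimestamp` is made up, and the exact Catalyst 
package layout may differ between versions:
    
    ```
    import org.apache.spark.sql.catalyst.expressions._
    import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
    import org.apache.spark.sql.catalyst.rules.Rule
    import org.apache.spark.sql.catalyst.types.{StringType, TimestampType}
    
    // Hypothetical rule: when a comparison mixes a TimestampType expression
    // with a StringType one, wrap the string side in a Cast to TimestampType
    // so the comparison is well-typed without an explicit CAST in the query.
    object CastStringToTimestamp extends Rule[LogicalPlan] {
      def apply(plan: LogicalPlan): LogicalPlan = plan transformAllExpressions {
        case GreaterThanOrEqual(left, right)
            if left.dataType == TimestampType && right.dataType == StringType =>
          GreaterThanOrEqual(left, Cast(right, TimestampType))
        case LessThanOrEqual(left, right)
            if left.dataType == TimestampType && right.dataType == StringType =>
          LessThanOrEqual(left, Cast(right, TimestampType))
        // ... analogous cases for <, >, = and the mirrored operand order
      }
    }
    ```
    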
    Given the way the rest of the code is written, it should probably work out 
of the box. What do you think?

