Github user gvramana commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1115#discussion_r125459816
  
    --- Diff: integration/spark2/src/main/scala/org/apache/spark/sql/execution/CastExpressionOptimization.scala ---
    @@ -66,6 +68,19 @@ object CastExpressionOptimization {
         }
       }
     
    +  def typeCastStringToLongForDateType(v: Any): Any = {
    +    try {
     +      // Spark also uses castToTimestamp to convert time to long, so to stay
     +      // in sync with Spark the filter cast format should be the same; hence
     +      // the castToTimestamp method is used here. Spark applies it in
     +      // Cast.scala under the ConstantFolding rule, before the carbon optimizer.
     +      val value = DateTimeUtils.stringToTimestamp(UTF8String.fromString(v.toString)).get
    --- End diff --
    
    When parsing fails, `stringToTimestamp` returns `None`. The `None` case
    behaviour needs to be validated against Hive: check whether null values are
    included in the output or not.
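
    A minimal sketch of how the `None` case could be handled instead of calling
    `.get` directly. `parseTimestampMicros` below is a hypothetical stand-in for
    Spark's internal `DateTimeUtils.stringToTimestamp`, which returns an
    `Option[Long]` (microseconds) and `None` on parse failure; whether `null` is
    the right fallback depends on the Hive behaviour being validated.

    ```scala
    // Hypothetical stand-in for DateTimeUtils.stringToTimestamp: returns
    // Some(microseconds since epoch) on success, None on parse failure.
    def parseTimestampMicros(s: String): Option[Long] =
      try Some(java.sql.Timestamp.valueOf(s).getTime * 1000L)
      catch { case _: IllegalArgumentException => None }

    // Handle the None case explicitly instead of calling .get, which would
    // throw NoSuchElementException on unparseable input.
    def typeCastStringToLongForDateType(v: Any): Any =
      parseTimestampMicros(v.toString) match {
        case Some(micros) => micros
        case None         => null // assumption: unparseable strings become NULL, as in Hive
      }
    ```

    With this shape, `typeCastStringToLongForDateType("2017-07-04 10:30:00")`
    yields a `Long`, while garbage input yields `null` rather than throwing.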

