AngersZhuuuu commented on a change in pull request #32001:
URL: https://github.com/apache/spark/pull/32001#discussion_r604070108



##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Cast.scala
##########
@@ -526,6 +533,20 @@ abstract class CastBase extends UnaryExpression with TimeZoneAwareExpression wit
       buildCast[UTF8String](_, s => IntervalUtils.safeStringToInterval(s))
   }
 
+  private[this] def castToDayTimeInterval(from: DataType): Any => Any = from match {
+    case x: IntegralType if ansiEnabled =>

Review comment:
       > We don't need to check the ANSI flag. The ANSI intervals suppose the ANSI mode. Just use exact ops. So, we don't have to support non-ANSI mode for ANSI intervals.
   
   So only conversions to the ANSI interval types can skip the `ansiEnabled` check, while the other conversions still need it?
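
   A minimal sketch of what the reviewer's suggestion amounts to, assuming the integral value counts days (the object name, helper name, and `MICROS_PER_DAY` constant here are illustrative, not the actual `Cast.scala` code): the cast always uses exact arithmetic, with no `ansiEnabled` guard, so overflow raises an error rather than wrapping.

   ```scala
   // Hypothetical sketch: cast an integral value (interpreted as days) to the
   // microseconds of a day-time interval, unconditionally using exact ops.
   object DayTimeIntervalCastSketch {
     final val MICROS_PER_DAY: Long = 24L * 60 * 60 * 1000 * 1000

     // Math.multiplyExact throws ArithmeticException on Long overflow
     // instead of silently wrapping, which matches ANSI semantics.
     def integralToDayTimeMicros(days: Long): Long =
       Math.multiplyExact(days, MICROS_PER_DAY)
   }
   ```

   Under this reading, the `if ansiEnabled` guard in the `case x: IntegralType` branch would simply be dropped for casts targeting ANSI interval types.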




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]



---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
