cloud-fan commented on a change in pull request #34169:
URL: https://github.com/apache/spark/pull/34169#discussion_r721854030



##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala
##########
@@ -160,11 +162,15 @@ case class Abs(child: Expression, failOnError: Boolean = SQLConf.get.ansiEnabled
 
   def this(child: Expression) = this(child, SQLConf.get.ansiEnabled)
 
-  override def inputTypes: Seq[AbstractDataType] = Seq(NumericType)
+  override def inputTypes: Seq[AbstractDataType] = Seq(TypeCollection.NumericAndInterval)

Review comment:
       We shouldn't use `TypeCollection.NumericAndInterval` here, as it includes the legacy interval type. With a legacy interval as input we would only hit a runtime exception, whereas an analysis exception is preferred.
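
       As a rough sketch of what I mean (the exact collection is open to discussion, and `TypeCollection(NumericType, DayTimeIntervalType, YearMonthIntervalType)` below is only an illustration, not necessarily what we should end up with), restricting the accepted types to numerics plus the ANSI interval types would surface the problem at analysis time:

```scala
// Sketch only: accept numeric types and the ANSI interval types, but not the
// legacy CalendarIntervalType, so unsupported inputs are rejected during
// analysis rather than failing at runtime.
// (These type objects live in org.apache.spark.sql.types.)
override def inputTypes: Seq[AbstractDataType] =
  Seq(TypeCollection(NumericType, DayTimeIntervalType, YearMonthIntervalType))
```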




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


