sathiyapk commented on a change in pull request #34729:
URL: https://github.com/apache/spark/pull/34729#discussion_r772972255



##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/mathExpressions.scala
##########
@@ -249,9 +249,9 @@ case class Cbrt(child: Expression) extends UnaryMathExpression(math.cbrt, "CBRT"
   """,
   since = "1.4.0",
   group = "math_funcs")
-case class Ceil(child: Expression) extends UnaryMathExpression(math.ceil, "CEIL") {
+ case class Ceil(child: Expression) extends UnaryMathExpression(math.ceil, "CEIL") {
   override def dataType: DataType = child.dataType match {

Review comment:
       I thought the same, but the `eval` of `RoundBase` already does that check:
   
   ```scala
   override def eval(input: InternalRow): Any = {
     if (scaleV == null) { // if scale is null, no need to eval its child at all
       null
     } else {
       val evalE = child.eval(input)
       if (evalE == null) {
         null
       } else {
         nullSafeEval(evalE)
       }
     }
   }
   ```
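
   (Just to restate the effect in a standalone way — this is only an illustration, not Spark code: a null scale makes the whole expression null without ever evaluating the child.)

   ```scala
   // Standalone sketch of the same short-circuit; `child` is by-name, so it
   // is never evaluated when the scale is null (None).
   def ceilWithScale(scale: Option[Int])(child: => Option[BigDecimal]): Option[BigDecimal] =
     scale match {
       case None    => None // null scale: result is null, child untouched
       case Some(s) => child.map(_.setScale(s, BigDecimal.RoundingMode.CEILING))
     }

   // ceilWithScale(None)(Some(BigDecimal("3.14")))    == None
   // ceilWithScale(Some(1))(Some(BigDecimal("3.14"))) == Some(BigDecimal("3.2"))
   ```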
   
   So if we must define the data type of the return expression, we could do something like:
   ```scala
   if (scaleV == null) return Literal.create(null, dtNullScale(child.dataType))

   def dtNullScale(dt: DataType): DataType = dt match {
     case _: FloatType | _: DoubleType => DoubleType
     case _ => LongType
   }
   ```
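
   To make the result type concrete (a minimal sketch only — `dtNullScale` is the hypothetical helper above, not existing Spark code):

   ```scala
   import org.apache.spark.sql.catalyst.expressions.Literal
   import org.apache.spark.sql.types._

   // Hypothetical helper from the suggestion above.
   def dtNullScale(dt: DataType): DataType = dt match {
     case _: FloatType | _: DoubleType => DoubleType
     case _ => LongType
   }

   // A null scale on a DOUBLE child would yield a typed null literal,
   // so downstream resolution still sees a well-defined data type.
   val nullResult = Literal.create(null, dtNullScale(DoubleType))
   assert(nullResult.value == null && nullResult.dataType == DoubleType)
   ```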
   
   What do you think?




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
