revans2 commented on a change in pull request #35913:
URL: https://github.com/apache/spark/pull/35913#discussion_r831171271



##########
File path: sql/core/src/main/scala/org/apache/spark/sql/functions.scala
##########
@@ -1783,7 +1783,9 @@ object functions {
    * @group math_funcs
    * @since 1.4.0
    */
-  def ceil(e: Column): Column = ceil(e, lit(0))
+  def ceil(e: Column): Column = withExpr {
+    UnresolvedFunction(Seq("ceil"), Seq(e.expr), isDistinct = false)
+  }

Review comment:
       I added test cases that explicitly check the result type.

   From a consistency standpoint, if the return type is going to depend on the
scale, then the scale can only ever be a literal value. If we want to break
backwards compatibility, then I would suggest that we also fix the overflow
issue (https://issues.apache.org/jira/browse/SPARK-28135), where a double is
rounded to a long. That technically also applies to a double being cast to a
decimal type and then rounded.
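The overflow referenced above stems from the JVM's double-to-long narrowing, which silently clamps out-of-range values rather than erroring. A minimal sketch in plain Scala (no Spark dependency; `ceilToLong` is a hypothetical helper written for illustration, not Spark's actual implementation):

```scala
// Illustration of the overflow behavior behind SPARK-28135.
// ceilToLong is a hypothetical helper for this sketch, not Spark code.
object CeilOverflowSketch {
  // Mirrors the problematic pattern: ceil a Double, then narrow to Long.
  def ceilToLong(d: Double): Long = math.ceil(d).toLong

  def main(args: Array[String]): Unit = {
    val big = 1.0e19 // exceeds Long.MaxValue (~9.22e18)
    // JVM d2l conversion clamps instead of failing, so the result is
    // Long.MaxValue rather than the mathematically correct ceiling.
    println(ceilToLong(big) == Long.MaxValue)
  }
}
```

The point is that the narrowing conversion loses information with no error or warning, which is why the overflow would need an explicit fix rather than relying on the cast.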




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


