Github user mgaido91 commented on a diff in the pull request:
https://github.com/apache/spark/pull/22395#discussion_r217977048
--- Diff:
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/ArithmeticExpressionSuite.scala
---
@@ -143,16 +143,14 @@ class ArithmeticExpressionSuite extends SparkFunSuite with ExpressionEvalHelper
     }
   }
-  // By fixing SPARK-15776, Divide's inputType is required to be DoubleType of DecimalType.
-  // TODO: in future release, we should add a IntegerDivide to support integral types.
-  ignore("/ (Divide) for integral type") {
-    checkEvaluation(Divide(Literal(1.toByte), Literal(2.toByte)), 0.toByte)
-    checkEvaluation(Divide(Literal(1.toShort), Literal(2.toShort)), 0.toShort)
-    checkEvaluation(Divide(Literal(1), Literal(2)), 0)
-    checkEvaluation(Divide(Literal(1.toLong), Literal(2.toLong)), 0.toLong)
-    checkEvaluation(Divide(positiveShortLit, negativeShortLit), 0.toShort)
-    checkEvaluation(Divide(positiveIntLit, negativeIntLit), 0)
-    checkEvaluation(Divide(positiveLongLit, negativeLongLit), 0L)
+  test("/ (Divide) for integral type") {
+    checkEvaluation(IntegralDivide(Literal(1.toByte), Literal(2.toByte)), 0L)
+    checkEvaluation(IntegralDivide(Literal(1.toShort), Literal(2.toShort)), 0L)
+    checkEvaluation(IntegralDivide(Literal(1), Literal(2)), 0L)
+    checkEvaluation(IntegralDivide(Literal(1.toLong), Literal(2.toLong)), 0L)
+    checkEvaluation(IntegralDivide(positiveShortLit, negativeShortLit), 0L)
+    checkEvaluation(IntegralDivide(positiveIntLit, negativeIntLit), 0L)
+    checkEvaluation(IntegralDivide(positiveLongLit, negativeLongLit), 0L)
--- End diff --
The test for this case is present in `operators.sql` (but if you prefer that I add a case here too, just let me know and I'll add it). Also, we already have this function in our code: it is simply translated to a normal divide plus a cast, and we currently return `null` in that case. Throwing an exception instead would be a behavior change (and quite a disruptive one). Do we really want to follow Hive's behavior on this?
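To make the semantics under discussion concrete, here is a minimal plain-Scala sketch (not Spark code; `integralDivide` is a hypothetical helper) of what the "divide + cast" translation amounts to: the quotient is truncated toward zero and a zero divisor yields null, modeled here as `None` rather than an exception:

```scala
// Hypothetical model of integral-divide semantics; not the Spark implementation.
object IntegralDivideSketch {
  // Returns None for a zero divisor (SQL null) instead of throwing,
  // matching the current non-exception behavior described above.
  def integralDivide(a: Long, b: Long): Option[Long] =
    if (b == 0L) None
    else Some(a / b) // JVM Long division already truncates toward zero

  def main(args: Array[String]): Unit = {
    assert(integralDivide(1L, 2L) == Some(0L))   // as in the new test: 1 / 2 -> 0L
    assert(integralDivide(7L, -2L) == Some(-3L)) // truncation toward zero, not flooring
    assert(integralDivide(1L, 0L) == None)       // null result, no exception thrown
    println("ok")
  }
}
```

Switching the `None` branch to a thrown exception is exactly the behavior change the comment above argues against.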
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]