HyukjinKwon opened a new pull request, #40117:
URL: https://github.com/apache/spark/pull/40117
### What changes were proposed in this pull request?

This PR proposes to disable ANSI mode for several `conv` test cases in `MathFunctionsSuite`; they intentionally test the behaviour when ANSI is disabled. I believe the exception cases are already handled in https://github.com/apache/spark/commit/cb463fb40e8f663b7e3019c8d8560a3490c241d0.

### Why are the changes needed?

To make the ANSI tests pass. They currently fail (https://github.com/apache/spark/actions/runs/4228390267/jobs/7343793692):

```
[info] - SPARK-33428 conv function should trim input string (177 milliseconds)
03:03:20.424 ERROR org.apache.spark.executor.Executor: Exception in task 0.0 in stage 138.0 (TID 256)
org.apache.spark.SparkArithmeticException: [ARITHMETIC_OVERFLOW] Overflow in function conv(). If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
	at org.apache.spark.sql.errors.QueryExecutionErrors$.arithmeticOverflowError(QueryExecutionErrors.scala:643)
	at org.apache.spark.sql.errors.QueryExecutionErrors$.overflowInConvError(QueryExecutionErrors.scala:315)
	at org.apache.spark.sql.catalyst.util.NumberConverter$.encode(NumberConverter.scala:68)
	at org.apache.spark.sql.catalyst.util.NumberConverter$.convert(NumberConverter.scala:158)
	at org.apache.spark.sql.catalyst.util.NumberConverter.convert(NumberConverter.scala)
	at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(generated.java:38)
	at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
	at org.apache.spark.sql.execution.WholeStageCodegenExec$$anon$1.hasNext(WholeStageCodegenExec.scala:760)
	at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
	at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
	at org.apache.spark.util.Iterators$.size(Iterators.scala:29)
	at org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1944)
	at org.apache.spark.rdd.RDD.$anonfun$count$1(RDD.scala:1266)
	at org.apache.spark.rdd.RDD.$anonfun$count$1$adapted(RDD.scala:1266)
	at org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2303)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:92)
	at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:161)
	at org.apache.spark.scheduler.Task.run(Task.scala:139)
	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:554)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1520)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:557)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)
```

### Does this PR introduce _any_ user-facing change?

No, test-only.

### How was this patch tested?

Fixed unit tests.
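For context (this is an illustration, not part of the patch): the `ARITHMETIC_OVERFLOW` above comes from `conv` producing a value that does not fit a signed 64-bit `Long`, which ANSI mode flags instead of silently wrapping. A minimal plain-Java sketch of the signed-vs-unsigned distinction involved, using only `java.lang.Long` (independent of Spark and of the actual `NumberConverter` code paths):

```java
public class ConvOverflowDemo {
    public static void main(String[] args) {
        // Sixteen 'f' hex digits encode a value above Long.MAX_VALUE, so it
        // cannot be represented as a signed Long -- this is the condition
        // conv() reports as ARITHMETIC_OVERFLOW under ANSI mode.
        String hex = "ffffffffffffffff";

        // parseUnsignedLong accepts the full unsigned 64-bit range; the
        // resulting bit pattern (all bits set) reads as -1 when treated
        // as a signed Long.
        long bits = Long.parseUnsignedLong(hex, 16);
        System.out.println(bits);                        // prints -1

        // Re-rendering the same bits as an unsigned decimal string recovers
        // the mathematically intended value.
        System.out.println(Long.toUnsignedString(bits)); // prints 18446744073709551615
    }
}
```

The disabled-ANSI `conv` tests exercise this wrapping behaviour on purpose, which is why the suite pins the config off rather than expecting the exception.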
