Eric5553 commented on a change in pull request #26977: [SPARK-30326][SQL] Raise exception if analyzer exceed max iterations
URL: https://github.com/apache/spark/pull/26977#discussion_r376694303
##########
File path:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/rules/RuleExecutor.scala
##########
@@ -156,7 +163,7 @@ abstract class RuleExecutor[TreeType <: TreeNode[_]]
extends Logging {
// Only log if this is a rule that is supposed to run more than once.
if (iteration != 2) {
val message = s"Max iterations (${iteration - 1}) reached for batch ${batch.name}"
- if (Utils.isTesting) {
+ if (Utils.isTesting || batch.strategy.errorOnExceed) {
throw new TreeNodeException(curPlan, message, null)
Review comment:
IMO, this is the common logic in `RuleExecutor` for handling the `errorOnExceed`
strategy, so it should not embed knowledge specific to the `Analyzer` subclass,
such as `ANALYZER_MAX_ITERATIONS`. That coupling could lead to bugs once we
introduce more MAX_ITERATION settings.
Maybe we can use a generic phrase here, such as 'increase the value of the
corresponding SQLConf setting'. Alternatively, we could add a string hint field
to `Strategy` and set it in `Analyzer` together with the `errorOnExceed` flag.
What do you think? @gatorsmile Thanks a lot!
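To make the alternative concrete, here is a minimal, self-contained sketch of the "hint field in `Strategy`" idea. All names below (`maxIterationsSetting`, the simplified `Strategy`/`FixedPoint` shapes, `maxIterationsMessage`) are illustrative stand-ins, not the actual Spark API; the point is only that `RuleExecutor` consults the strategy object and never an analyzer-specific conf constant.

```scala
object RuleExecutorSketch {
  // Simplified stand-in for RuleExecutor.Strategy: the subclass that builds
  // the strategy (e.g. Analyzer) supplies both the error flag and a hint
  // naming the conf a user could raise. The hint is empty when inapplicable.
  sealed trait Strategy {
    def maxIterations: Int
    def errorOnExceed: Boolean = false
    def maxIterationsSetting: String = ""
  }

  // Stand-in for the fixed-point strategy; callers opt in to the error
  // behavior and pass their own conf key, keeping RuleExecutor generic.
  case class FixedPoint(
      maxIterations: Int,
      override val errorOnExceed: Boolean = false,
      override val maxIterationsSetting: String = "") extends Strategy

  // The generic message-building logic RuleExecutor could use: it only
  // reads fields off the strategy, with no analyzer-specific knowledge.
  def maxIterationsMessage(batchName: String, strategy: Strategy): String = {
    val hint =
      if (strategy.maxIterationsSetting.nonEmpty) {
        s", please set '${strategy.maxIterationsSetting}' to a larger value."
      } else {
        "."
      }
    s"Max iterations (${strategy.maxIterations}) reached for batch $batchName$hint"
  }
}
```

With this shape, introducing another batch with its own max-iteration conf only requires constructing its strategy with a different `maxIterationsSetting` string; the shared code in `RuleExecutor` stays unchanged.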
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services