juliuszsompolski commented on a change in pull request #29999:
URL: https://github.com/apache/spark/pull/29999#discussion_r545189045
##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
##########
@@ -216,6 +216,18 @@ object SQLConf {
       "for using switch statements in InSet must be non-negative and less than or equal to 600")
     .createWithDefault(400)
+
+  val OPTIMIZER_LIKE_ALL_CONVERSION_THRESHOLD =
+    buildConf("spark.sql.optimizer.likeAllConversionThreshold")
+      .internal()
+      .doc("The maximum size of the pattern sequence in LIKE ALL. Beyond this threshold, " +
+        "Spark will convert the logical combination of Like predicates to avoid " +
+        "StackOverflowError. 200 is an empirical value that does not cause StackOverflowError.")
+      .version("3.1.0")
+      .intConf
+      .checkValue(threshold => threshold >= 0, "The maximum size of the pattern sequence " +
+        "in LIKE ALL must be non-negative")
+      .createWithDefault(200)
Review comment:
A tree of 200 And-reduced expressions is already a huge expression tree.
I think this conversion would already be useful and helpful with a default threshold of 5 or so.
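
To make the concern concrete, here is a minimal, self-contained sketch of why the threshold matters. The `Expr`/`Like`/`And` case classes below are illustrative stand-ins, not Spark's actual Catalyst classes: expanding `e LIKE ALL (p1, ..., pn)` into an And-reduced combination of Like predicates produces a left-deep tree whose depth grows linearly with the number of patterns, and any recursive traversal of that tree needs one stack frame per `And` node.

```scala
// Illustrative stand-ins for Catalyst expressions (not Spark's real classes).
sealed trait Expr
case class Like(pattern: String) extends Expr
case class And(left: Expr, right: Expr) extends Expr

// "e LIKE ALL (p1, ..., pn)" expanded as Like(p1) AND ... AND Like(pn):
// reduceLeft builds a left-deep tree with one And node per extra pattern.
def likeAll(patterns: Seq[String]): Expr =
  patterns.map(p => Like(p): Expr).reduceLeft(And(_, _))

// Depth of the tree; a recursive traversal like this (or a recursive
// expression transform) consumes one stack frame per And node.
def depth(e: Expr): Int = e match {
  case And(l, r) => 1 + math.max(depth(l), depth(r))
  case _         => 0
}

// 200 patterns already yield a tree 199 And nodes deep.
assert(depth(likeAll((1 to 200).map(i => s"%p$i%"))) == 199)
```

So even at the proposed default of 200, recursive passes over the expanded tree are already close to the scale where deep trees become a stack-depth hazard, which is why a much smaller default such as 5 could already pay off.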
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]