dongjoon-hyun commented on code in PR #44133:
URL: https://github.com/apache/spark/pull/44133#discussion_r1413192567
##########
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala:
##########
@@ -564,6 +564,14 @@ object SQLConf {
.booleanConf
.createWithDefault(false)
+ val UNWRAP_CAST_IN_JOIN_CONDITION_ENABLED =
+ buildConf("spark.sql.unwrapCastInJoinCondition.enabled")
+ .doc("When true, unwrap the cast in the join condition to reduce shuffle if they are " +
+ "integral types.")
+ .version("4.0.0")
+ .booleanConf
+ .createWithDefault(true)
Review Comment:
Although this is enabled by default, there seems to be no change in the
generated plans in the test suites.
Does that mean Apache Spark's test coverage didn't include such cases so far?
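For context, a conceptual sketch of what "unwrap the cast in the join condition" means (this is NOT Spark's actual rule, just an illustration under assumed names and types): joining on `CAST(int_col AS BIGINT) = bigint_col` forces partitioning on the cast expression, so an existing partitioning on `int_col` cannot be reused; pushing the cast to the other side leaves the bare column as the join key. A real rule must also guard against overflow; this toy version assumes the downcast is known safe.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Attr:          # a column reference with a data type
    name: str
    dtype: str       # "int" or "bigint"

@dataclass(frozen=True)
class Cast:          # CAST(child AS dtype)
    child: Attr
    dtype: str

@dataclass(frozen=True)
class EqualTo:       # left = right
    left: object
    right: object

def unwrap_cast(cond: EqualTo) -> EqualTo:
    """If one side is CAST(intAttr AS bigint) and the other side is a bigint
    attribute, move the cast to the other side so the join key on the int
    side is the bare column (assumes the downcast cannot overflow)."""
    l, r = cond.left, cond.right
    if (isinstance(l, Cast) and l.dtype == "bigint" and l.child.dtype == "int"
            and isinstance(r, Attr) and r.dtype == "bigint"):
        return EqualTo(l.child, Cast(r, "int"))
    return cond

# CAST(a AS BIGINT) = b  becomes  a = CAST(b AS INT)
cond = EqualTo(Cast(Attr("a", "int"), "bigint"), Attr("b", "bigint"))
rewritten = unwrap_cast(cond)
```

After the rewrite, the left join key is the bare `int` column, so a child partitioning on it could be reused instead of reshuffling on the cast expression.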
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]