wangyum commented on a change in pull request #31984:
URL: https://github.com/apache/spark/pull/31984#discussion_r603167405
##########
File path: sql/core/src/main/scala/org/apache/spark/sql/execution/dynamicpruning/PartitionPruning.scala
##########
@@ -48,6 +49,10 @@ import org.apache.spark.sql.execution.datasources.{HadoopFsRelation, LogicalRela
  */
 object PartitionPruning extends Rule[LogicalPlan] with PredicateHelper with JoinSelectionHelper {
+  private val buildBroadcastThreshold = math.max(
+    AUTO_BROADCASTJOIN_THRESHOLD.defaultValue.getOrElse(conf.autoBroadcastJoinThreshold),
Review comment:
To avoid disabling DPP when `autoBroadcastJoinThreshold` is set to a small value. For example:

1. If the user sets `autoBroadcastJoinThreshold` to -1 and DPP has a benefit:

Before this PR:

spark.sql.optimizer.dynamicPartitionPruning.reuseBroadcastOnly | filtering side data size <= 10MB | filtering side data size > 10MB
-- | -- | --
true | N | N
false | Y | Y

After this PR:

spark.sql.optimizer.dynamicPartitionPruning.reuseBroadcastOnly | filtering side data size <= 10MB | filtering side data size > 10MB
-- | -- | --
true | N | N
false | Y | N

2. If the user sets `autoBroadcastJoinThreshold` to 100MB and DPP has a benefit:

Before this PR:

spark.sql.optimizer.dynamicPartitionPruning.reuseBroadcastOnly | filtering side data size <= 100MB | filtering side data size > 100MB
-- | -- | --
true | Y | N
false | Y | Y

After this PR:

spark.sql.optimizer.dynamicPartitionPruning.reuseBroadcastOnly | filtering side data size <= 100MB | filtering side data size > 100MB
-- | -- | --
true | Y | N
false | Y | N

`Y` means a DPP filter will be inserted, `N` means it will not. This is much more reasonable than before.
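
To make the two "After this PR" tables concrete, here is a minimal, self-contained Scala sketch of the decision they describe. It is not the actual `PartitionPruning` rule: `DppThresholdSketch`, `shouldInsertDppFilter`, `canReuseBroadcast`, and the other names are hypothetical, and the only piece taken from the diff above is the `math.max(default, configured)` idea behind `buildBroadcastThreshold`.

```scala
// A minimal, standalone sketch of the decision described by the tables above.
// This is NOT the actual Spark PartitionPruning rule; all names below are
// hypothetical illustrations of the threshold logic.
object DppThresholdSketch {

  // Default of spark.sql.autoBroadcastJoinThreshold (10 MB).
  val defaultAutoBroadcastThreshold: Long = 10L * 1024 * 1024

  // Lower bound used when deciding whether building a DPP filter is worthwhile:
  // even if the user sets autoBroadcastJoinThreshold very low (e.g. -1), the
  // filtering side may still be small enough to justify a pruning subquery.
  def buildBroadcastThreshold(configuredThreshold: Long): Long =
    math.max(defaultAutoBroadcastThreshold, configuredThreshold)

  // Mirrors the Y/N cells above: with reuseBroadcastOnly = true only an existing
  // broadcast can be reused; otherwise a DPP filter is also inserted when the
  // filtering side stays under buildBroadcastThreshold.
  def shouldInsertDppFilter(
      reuseBroadcastOnly: Boolean,
      canReuseBroadcast: Boolean,
      filteringSideSize: Long,
      configuredThreshold: Long): Boolean = {
    if (reuseBroadcastOnly) {
      canReuseBroadcast
    } else {
      canReuseBroadcast ||
        filteringSideSize <= buildBroadcastThreshold(configuredThreshold)
    }
  }

  def main(args: Array[String]): Unit = {
    val mb = 1L * 1024 * 1024
    // autoBroadcastJoinThreshold = -1 (example 1, "After this PR" table):
    // a 100 MB filtering side no longer gets a DPP filter ...
    println(shouldInsertDppFilter(reuseBroadcastOnly = false, canReuseBroadcast = false,
      filteringSideSize = 100 * mb, configuredThreshold = -1)) // false (N)
    // ... but a 5 MB filtering side still does.
    println(shouldInsertDppFilter(reuseBroadcastOnly = false, canReuseBroadcast = false,
      filteringSideSize = 5 * mb, configuredThreshold = -1))   // true (Y)
  }
}
```

Under these assumptions, example 1 evaluates `buildBroadcastThreshold` to `max(10MB, -1) = 10MB` and example 2 to `max(10MB, 100MB) = 100MB`, which is exactly where the `Y`/`N` boundary sits in the "After this PR" tables.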
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]