wzhfy commented on a change in pull request #29589:
URL: https://github.com/apache/spark/pull/29589#discussion_r483404918
##########
File path:
sql/core/src/test/scala/org/apache/spark/sql/DynamicPartitionPruningSuite.scala
##########
@@ -1342,6 +1345,52 @@ abstract class DynamicPartitionPruningSuiteBase
}
}
}
+
+  test("SPARK-32748: propagate local properties to dynamic pruning thread") {
+    def checkPropertyValueByUdfResult(propKey: String, propValue: String): Unit = {
+      spark.sparkContext.setLocalProperty(propKey, propValue)
+      val df = sql(
+        s"""
+           |SELECT compare_property_value(f.date_id, '$propKey', '$propValue') as col
+           |FROM fact_sk f
+           |INNER JOIN dim_store s
+           |ON f.store_id = s.store_id AND s.country = 'NL'
+         """.stripMargin)
+
+      checkPartitionPruningPredicate(df, false, true)
+      assert(df.collect().forall(_.toSeq == Seq(true)))
+    }
+
+    try {
+      SQLConf.get.setConf(StaticSQLConf.BROADCAST_EXCHANGE_MAX_THREAD_THRESHOLD, 1)
Review comment:
Yes, you are right, but it's not because it's a static conf; it's
because `executionContext` lives in the `SubqueryBroadcastExec`
companion object. That makes it hard to write a unit test for this.
Do you have any suggestions?
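To illustrate the testability problem being discussed: because the thread pool is a process-wide singleton held by a companion object, a test cannot swap in a smaller pool, and any thread-local state (such as Spark's local properties) must be captured on the caller's thread and re-installed inside the pool thread. The sketch below is a hedged, standalone analogy (not Spark code): `PropagationSketch`, `localProps`, and `spark.job.tag` are hypothetical names, with a plain `ThreadLocal`-backed map standing in for `SparkContext`'s local properties.

```scala
import java.util.concurrent.Executors
import scala.concurrent.{Await, ExecutionContext, Future}
import scala.concurrent.duration._

object PropagationSketch {
  // Hypothetical stand-in for SparkContext's local properties.
  private val localProps = new ThreadLocal[Map[String, String]] {
    override def initialValue(): Map[String, String] = Map.empty
  }

  // Like the pool in the SubqueryBroadcastExec companion object, this is a
  // process-wide singleton: a unit test cannot inject a single-thread pool.
  private val executionContext =
    ExecutionContext.fromExecutorService(Executors.newFixedThreadPool(4))

  def run(): String = {
    localProps.set(Map("spark.job.tag" -> "pruning"))
    // Capture the caller's properties BEFORE submitting the task, then
    // re-install them on the pool thread -- the propagation pattern the
    // SPARK-32748 discussion is about.
    val captured = localProps.get()
    val f = Future {
      localProps.set(captured)
      localProps.get()("spark.job.tag")
    }(executionContext)
    try Await.result(f, 10.seconds)
    finally executionContext.shutdown()
  }

  def main(args: Array[String]): Unit =
    println(run()) // prints "pruning"
}
```

Without the capture/re-install step, the pool thread would see an empty map, which is exactly the kind of behavior the `checkPropertyValueByUdfResult` helper in the diff asserts against.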
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]