wangyum commented on a change in pull request #29243:
URL: https://github.com/apache/spark/pull/29243#discussion_r475485835
##########
File path:
sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q16/explain.txt
##########
@@ -1,235 +1,247 @@
== Physical Plan ==
-TakeOrderedAndProject (41)
-+- * HashAggregate (40)
- +- Exchange (39)
- +- * HashAggregate (38)
- +- * HashAggregate (37)
- +- Exchange (36)
- +- * HashAggregate (35)
- +- * Project (34)
- +- * BroadcastHashJoin Inner BuildRight (33)
- :- * Project (27)
- : +- * BroadcastHashJoin Inner BuildRight (26)
- : :- * Project (20)
- : : +- * BroadcastHashJoin Inner BuildRight (19)
- : : :- * BroadcastHashJoin LeftAnti BuildRight (13)
- : : : :- * Project (9)
- : : : : +- * BroadcastHashJoin LeftSemi BuildRight (8)
+TakeOrderedAndProject (43)
++- * HashAggregate (42)
+ +- Exchange (41)
+ +- * HashAggregate (40)
+ +- * HashAggregate (39)
+ +- Exchange (38)
+ +- * HashAggregate (37)
+ +- * Project (36)
+ +- * BroadcastHashJoin Inner BuildRight (35)
+ :- * Project (29)
+ : +- * BroadcastHashJoin Inner BuildRight (28)
+ : :- * Project (22)
+ : : +- * BroadcastHashJoin Inner BuildRight (21)
+ : : :- * BroadcastHashJoin LeftAnti BuildRight (15)
+ : : : :- * Project (10)
+ : : : : +- * BroadcastHashJoin LeftSemi BuildRight (9)
: : : : :- * Filter (3)
: : : : : +- * ColumnarToRow (2)
: : : : : +- Scan parquet default.catalog_sales (1)
- : : : : +- BroadcastExchange (7)
- : : : : +- * Project (6)
- : : : : +- * ColumnarToRow (5)
- : : : : +- Scan parquet default.catalog_sales (4)
- : : : +- BroadcastExchange (12)
- : : : +- * ColumnarToRow (11)
- : : : +- Scan parquet default.catalog_returns (10)
- : : +- BroadcastExchange (18)
- : : +- * Project (17)
- : : +- * Filter (16)
- : : +- * ColumnarToRow (15)
- : : +- Scan parquet default.date_dim (14)
- : +- BroadcastExchange (25)
- : +- * Project (24)
- : +- * Filter (23)
- : +- * ColumnarToRow (22)
- : +- Scan parquet default.customer_address (21)
- +- BroadcastExchange (32)
- +- * Project (31)
- +- * Filter (30)
- +- * ColumnarToRow (29)
- +- Scan parquet default.call_center (28)
+ : : : : +- BroadcastExchange (8)
+ : : : : +- * Project (7)
+ : : : : +- * Filter (6)
+ : : : : +- * ColumnarToRow (5)
+ : : : : +- Scan parquet default.catalog_sales (4)
+ : : : +- BroadcastExchange (14)
+ : : : +- * Filter (13)
+ : : : +- * ColumnarToRow (12)
+ : : : +- Scan parquet default.catalog_returns (11)
+ : : +- BroadcastExchange (20)
+ : : +- * Project (19)
+ : : +- * Filter (18)
+ : : +- * ColumnarToRow (17)
+ : : +- Scan parquet default.date_dim (16)
+ : +- BroadcastExchange (27)
+ : +- * Project (26)
+ : +- * Filter (25)
+ : +- * ColumnarToRow (24)
+ : +- Scan parquet default.customer_address (23)
+ +- BroadcastExchange (34)
+ +- * Project (33)
+ +- * Filter (32)
+ +- * ColumnarToRow (31)
+ +- Scan parquet default.call_center (30)
(1) Scan parquet default.catalog_sales
Output [7]: [cs_ship_date_sk#1, cs_ship_addr_sk#2, cs_call_center_sk#3, cs_warehouse_sk#4, cs_order_number#5, cs_ext_ship_cost#6, cs_net_profit#7]
Batched: true
Location: InMemoryFileIndex [file:/Users/yi.wu/IdeaProjects/spark/sql/core/spark-warehouse/org.apache.spark.sql.TPCDSV1_4_PlanStabilitySuite/catalog_sales]
-PushedFilters: [IsNotNull(cs_ship_date_sk), IsNotNull(cs_ship_addr_sk), IsNotNull(cs_call_center_sk)]
+PushedFilters: [IsNotNull(cs_ship_date_sk), IsNotNull(cs_ship_addr_sk), IsNotNull(cs_call_center_sk), IsNotNull(cs_order_number), IsNotNull(cs_warehouse_sk)]
Review comment:
The inferred `IsNotNull` predicates are pushed down to the Parquet scan, so they can filter out more data before the joins run.
Before this change:

After this change:

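The benefit of the inferred `IsNotNull` filters can be illustrated outside Spark. This is a plain-Python sketch (not Spark code, and the sample rows are hypothetical): rows whose equi-join key is NULL can never match an inner or semi join, so a pushed-down `IsNotNull` on the join key prunes them at the scan without changing the join result.

```python
def pushed_scan(rows, not_null_cols):
    # Simulate a scan with IsNotNull pushed filters: drop rows early.
    return [r for r in rows if all(r[c] is not None for c in not_null_cols)]

def equi_join(left, right, key):
    # Simulate an inner equi-join; a NULL key never matches (SQL semantics).
    return [(l, r) for l in left for r in right
            if l[key] is not None and l[key] == r[key]]

# Hypothetical catalog_sales rows; a NULL cs_order_number can never join.
catalog_sales = [
    {"cs_order_number": 1},
    {"cs_order_number": None},
    {"cs_order_number": 3},
]
other = [{"cs_order_number": 1}]

# Without the inferred filter, all 3 rows flow into the join operator.
naive = equi_join(catalog_sales, other, "cs_order_number")

# With IsNotNull(cs_order_number) pushed down, the NULL row is skipped
# at the scan, yet the join output is identical.
filtered = pushed_scan(catalog_sales, ["cs_order_number"])
pruned = equi_join(filtered, other, "cs_order_number")
assert naive == pruned
print(len(catalog_sales), len(filtered))  # 3 2
```

The same reasoning is why the optimizer can safely add `IsNotNull(cs_order_number)` and `IsNotNull(cs_warehouse_sk)` to `PushedFilters` here: both columns appear as join keys in the plan above.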
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]