wangfei created SPARK-4938:
------------------------------

             Summary: Adding optimization to simplify the filter condition
                 Key: SPARK-4938
                 URL: https://issues.apache.org/jira/browse/SPARK-4938
             Project: Spark
          Issue Type: Improvement
          Components: SQL
    Affects Versions: 1.1.0
            Reporter: wangfei
             Fix For: 1.3.0


Add an optimization to simplify filter conditions:
1  Fold conditions that always evaluate to a constant boolean (see the sketch after the example query below), for example:
a < 3 && a > 5     ----  False
a < 1 || a > 0     ----  True

2  Simplify And/Or conditions, as in the following query (one of the hive-testbench
queries):
select
    sum(l_extendedprice* (1 - l_discount)) as revenue
from
    lineitem,
    part
where
    (
        p_partkey = l_partkey
        and p_brand = 'Brand#32'
        and p_container in ('SM CASE', 'SM BOX', 'SM PACK', 'SM PKG')
        and l_quantity >= 7 and l_quantity <= 7 + 10
        and p_size between 1 and 5
        and l_shipmode in ('AIR', 'AIR REG')
        and l_shipinstruct = 'DELIVER IN PERSON'
    )
    or
    (
        p_partkey = l_partkey
        and p_brand = 'Brand#35'
        and p_container in ('MED BAG', 'MED BOX', 'MED PKG', 'MED PACK')
        and l_quantity >= 15 and l_quantity <= 15 + 10
        and p_size between 1 and 10
        and l_shipmode in ('AIR', 'AIR REG')
        and l_shipinstruct = 'DELIVER IN PERSON'
    )
    or
    (
        p_partkey = l_partkey
        and p_brand = 'Brand#24'
        and p_container in ('LG CASE', 'LG BOX', 'LG PACK', 'LG PKG')
        and l_quantity >= 26 and l_quantity <= 26 + 10
        and p_size between 1 and 15
        and l_shipmode in ('AIR', 'AIR REG')
        and l_shipinstruct = 'DELIVER IN PERSON'
    );
Before this optimization the query plans as a CartesianProduct; in my local test the
query hung and never produced a result. After the optimization the join predicate
p_partkey = l_partkey, which appears in every Or branch, can be pulled out of the
filter, the CartesianProduct is replaced by a ShuffledHashJoin, and the query runs
in 20+ seconds.


