[ https://issues.apache.org/jira/browse/SPARK-45199?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

ASF GitHub Bot updated SPARK-45199:
-----------------------------------
    Labels: pull-request-available  (was: )

> Release cast from attribute in filter to support predicate push down
> --------------------------------------------------------------------
>
>                 Key: SPARK-45199
>                 URL: https://issues.apache.org/jira/browse/SPARK-45199
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.5.0
>            Reporter: Wechar
>            Priority: Major
>              Labels: pull-request-available
>
> When a {{cast}} is applied to a column in a filter, the predicate cannot be 
> pushed down. We can remove the cast from the attribute side to support 
> predicate push down when the cast does not change the precision or range 
> (a simplified sketch of this rewrite appears after the plans below).
> Test query:
> {code:sql}
> -- dt is string type
> explain select * from wechar_tbl where cast(dt as date) = 
> date_sub(current_date(), 1);
> {code}
> Before this patch:
> {code:bash}
> == Physical Plan ==
> *(1) ColumnarToRow
> +- FileScan parquet default.wechar_tbl[id#5,name#6,dt#7] Batched: true, 
> DataFilters: [], Format: Parquet, Location: InMemoryFileIndex(0 paths)[], 
> PartitionFilters: [isnotnull(dt#7), (cast(dt#7 as date) = 2023-09-17)], 
> PushedFilters: [], ReadSchema: struct<id:int,name:string>
> {code}
> *cast(dt#7 as date) = 2023-09-17 cannot be pushed down in the partition filter*
> After this patch:
> {code:bash}
> == Physical Plan ==
> *(1) ColumnarToRow
> +- FileScan parquet default.wechar_tbl[id#62,name#63,dt#64] Batched: true, 
> DataFilters: [], Format: Parquet, Location: InMemoryFileIndex[], 
> PartitionFilters: [isnotnull(dt#64), (dt#64 = 2023-09-17)], PushedFilters: 
> [], ReadSchema: struct<id:int,name:string>
> {code}
> *(dt#64 = 2023-09-17) can be pushed down in the partition filter*
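> Below is a minimal, self-contained Scala sketch of the rewrite, using a toy 
> expression tree rather than Spark's actual Catalyst classes (Attr, CastToDate, 
> DateLit, StrLit and the rule object are hypothetical names chosen for 
> illustration). A real rule would also have to verify that dropping the cast 
> cannot change the comparison result (precision, range, malformed strings):
> {code:scala}
> import java.time.LocalDate
> 
> // Toy expression tree standing in for Catalyst expressions.
> sealed trait Expr
> case class Attr(name: String) extends Expr               // string-typed column, e.g. dt
> case class CastToDate(child: Expr) extends Expr          // cast(dt as date)
> case class DateLit(value: LocalDate) extends Expr        // date literal
> case class StrLit(value: String) extends Expr            // string literal
> case class EqualTo(left: Expr, right: Expr) extends Expr // equality predicate
> 
> object UnwrapCastFromAttribute {
>   // Rewrite cast(attr as date) = <date literal> into attr = '<yyyy-MM-dd>'.
>   // Only safe when the string-to-date cast round-trips exactly; any other
>   // predicate shape is left untouched so semantics are preserved.
>   def apply(pred: Expr): Expr = pred match {
>     case EqualTo(CastToDate(a: Attr), DateLit(d)) => EqualTo(a, StrLit(d.toString))
>     case EqualTo(DateLit(d), CastToDate(a: Attr)) => EqualTo(StrLit(d.toString), a)
>     case other                                    => other
>   }
> }
> 
> object Demo extends App {
>   val before = EqualTo(CastToDate(Attr("dt")), DateLit(LocalDate.of(2023, 9, 17)))
>   // Prints EqualTo(Attr(dt),StrLit(2023-09-17)): the attribute is now bare,
>   // so the predicate is eligible for partition-filter push down.
>   println(UnwrapCastFromAttribute(before))
> }
> {code}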


