[ https://issues.apache.org/jira/browse/FLINK-24014?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17434748#comment-17434748 ]

jocean.shi commented on FLINK-24014:
------------------------------------

Hi [~airblader] I don't understand this case: the previous record is +I[key1, a=9], 
so I think the next record would be -D[key1, a=9], not -D[key1, a=10].

If the next record really is -D[key1, a=10], that would be an error even without pushdown.
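
To make the two cases concrete, here is a sketch of how the records could flow, 
assuming ChangelogNormalize materializes the last value per key and restores it 
when emitting the delete (an illustration only, not taken from the issue):

Without pushdown (correct):
  source:           +I[key1, a=9]   -D[key1, a=10]
  after normalize:  +I[key1, a=9]   -D[key1, a=9]   (value restored from state)
  after a < 10:     +I[key1, a=9]   -D[key1, a=9]   => key1 is deleted

With pushdown (incorrect):
  source with a<10: +I[key1, a=9]                   (-D dropped, a=10 fails the filter)
  after normalize:  +I[key1, a=9]
  result:           key1 wrongly remains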

> Filters pushed through ChangelogNormalize cause incorrect results
> -----------------------------------------------------------------
>
>                 Key: FLINK-24014
>                 URL: https://issues.apache.org/jira/browse/FLINK-24014
>             Project: Flink
>          Issue Type: Bug
>          Components: Table SQL / Planner
>    Affects Versions: 1.13.2
>            Reporter: Ingo Bürk
>            Assignee: Ingo Bürk
>            Priority: Major
>
> If a source implements SupportsFilterPushDown, all filters of a query get 
> pushed through to the source, including past a ChangelogNormalize.
> However, this can cause incorrect results, because pushing non-PK filters past 
> ChangelogNormalize is unsafe. For example, consider a filter a < 10 and records 
> +I[key1, a=9], -D[key1, a=10]: if the filter is applied in the source, the 
> -D record is dropped and key1 is never retracted from the result.
> A strategy to fix this could be a physical rule which de-optimizes the filter 
> pushdown; however, we should first investigate how strict we need to be here. 
> For now we could also simply document this in the SupportsFilterPushDown 
> interface and warn implementers about this possibility.
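
As a concrete setup in which such a plan could arise, here is a minimal sketch 
(the table name, schema, and connector choice are hypothetical, picked only for 
illustration; whether a given connector implements SupportsFilterPushDown varies):

-- Hypothetical upsert source; k is the primary key, a is a non-PK column.
CREATE TABLE src (
  k STRING,
  a INT,
  PRIMARY KEY (k) NOT ENFORCED
) WITH (
  'connector' = 'upsert-kafka',
  'topic' = 'src',
  'properties.bootstrap.servers' = 'localhost:9092',
  'key.format' = 'json',
  'value.format' = 'json'
);

-- The planner inserts a ChangelogNormalize for this upsert source. If the source
-- also implemented SupportsFilterPushDown, 'a < 10' could be pushed below that
-- normalization and drop records such as -D[key1, a=10] at the source.
SELECT k, a FROM src WHERE a < 10;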



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
