[
https://issues.apache.org/jira/browse/FLINK-8492?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16335891#comment-16335891
]
Hequn Cheng commented on FLINK-8492:
------------------------------------
Hi [~fhueske], thanks for your reply. We can move calc rules
(FilterToCalcRule,ProjectToCalcRule and CalcMergeRule) into normalize phrase,
this will make the above test case pass, but the current logical optimize phase
may re-introduce multi calcs due to project push down for other cases, and
volcano seems to chose the un-merged plan as the smallest cost plan.
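For reference, a minimal sketch of what moving those rules could look like, assuming they are simply collected into a rule set used by the normalization (HEP) phase; the rule set name below is illustrative, not existing Flink code:
{code:scala}
import org.apache.calcite.rel.rules.{CalcMergeRule, FilterToCalcRule, ProjectToCalcRule}
import org.apache.calcite.tools.{RuleSet, RuleSets}

// Illustrative only: the calc-related rules that would be moved from the
// logical optimization rule set into the normalization rule set.
val NORM_CALC_RULES: RuleSet = RuleSets.ofList(
  // convert LogicalFilter / LogicalProject into LogicalCalc
  FilterToCalcRule.INSTANCE,
  ProjectToCalcRule.INSTANCE,
  // merge adjacent LogicalCalc nodes into a single Calc
  CalcMergeRule.INSTANCE
)
{code}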
> Fix unsupported exception for udtf with multi calc
> --------------------------------------------------
>
> Key: FLINK-8492
> URL: https://issues.apache.org/jira/browse/FLINK-8492
> Project: Flink
> Issue Type: Bug
> Components: Table API & SQL
> Reporter: Hequn Cheng
> Assignee: Hequn Cheng
> Priority: Major
>
> Considering the following test, an unsupported exception will be thrown because
> multiple Calcs exist between the Correlate and the TableFunctionScan.
> {code:scala}
> @Test
> def testCrossJoinWithMultiFilter(): Unit = {
>   val t = testData(env).toTable(tEnv).as('a, 'b, 'c)
>   val func0 = new TableFunc0
>
>   val result = t
>     .join(func0('c) as('d, 'e))
>     .select('c, 'd, 'e)
>     .where('e > 10)
>     .where('e > 20)
>     .select('c, 'd)
>     .toAppendStream[Row]
>
>   result.addSink(new StreamITCase.StringSink[Row])
>   env.execute()
>
>   val expected = mutable.MutableList("Jack#22,Jack,22", "Anna#44,Anna,44")
>   assertEquals(expected.sorted, StreamITCase.testResults.sorted)
> }
> {code}
> I can see two options to fix this problem:
> # Adapt a Calcite optimization rule to merge the consecutive Calcs.
> # Merge the multiple Calcs in the correlate convert rule.
> I prefer the second one: not only is it easy to implement, but I also think that
> the presence or absence of an optimization rule should not affect Flink's
> functionality. A rough sketch of the second option is given below.
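> A possible sketch for option 2, assuming the convert rule walks down the chain of
> LogicalCalc nodes and merges their programs with Calcite's RexProgramBuilder; the
> helper name mergeCalcChain is illustrative, not existing Flink code:
> {code:scala}
> import org.apache.calcite.rel.logical.LogicalCalc
> import org.apache.calcite.rex.{RexBuilder, RexProgram, RexProgramBuilder}
>
> // Illustrative helper: collapse a chain of LogicalCalc nodes between the
> // Correlate and the TableFunctionScan into a single RexProgram.
> def mergeCalcChain(calc: LogicalCalc, rexBuilder: RexBuilder): RexProgram =
>   calc.getInput match {
>     // another Calc below: merge this Calc's program (top) with the merged
>     // program of the chain underneath it (bottom)
>     case child: LogicalCalc =>
>       RexProgramBuilder.mergePrograms(
>         calc.getProgram, mergeCalcChain(child, rexBuilder), rexBuilder)
>     // reached the scan: this Calc's program is the bottom of the chain
>     case _ => calc.getProgram
>   }
> {code}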