This is an automated email from the ASF dual-hosted git repository.
lincoln pushed a commit to branch release-1.17
in repository https://gitbox.apache.org/repos/asf/flink.git
The following commit(s) were added to refs/heads/release-1.17 by this push:
new 30ec036f4d7 [FLINK-32457][table-planner] Add test cases of unsupported usage for JSON_OBJECTAGG & JSON_ARRAYAGG and also update docs
30ec036f4d7 is described below
commit 30ec036f4d7cac1523bff60e224019ad631c5077
Author: lincoln lee <[email protected]>
AuthorDate: Wed Jul 19 10:58:38 2023 +0800
[FLINK-32457][table-planner] Add test cases of unsupported usage for JSON_OBJECTAGG & JSON_ARRAYAGG and also update docs
This closes #22976
---
docs/data/sql_functions.yml | 72 ++++++------
docs/data/sql_functions_zh.yml | 66 +++++------
.../WrapJsonAggFunctionArgumentsRuleTest.java | 112 +++++++++++-------
.../WrapJsonAggFunctionArgumentsRuleTest.xml | 127 ++++++++++++++++++---
4 files changed, 246 insertions(+), 131 deletions(-)
diff --git a/docs/data/sql_functions.yml b/docs/data/sql_functions.yml
index 205dcae743c..aaeee9803f9 100644
--- a/docs/data/sql_functions.yml
+++ b/docs/data/sql_functions.yml
@@ -842,25 +842,6 @@ json:
)
)
```
- - sql: JSON_OBJECTAGG([KEY] key VALUE value [ { NULL | ABSENT } ON NULL ])
- table: jsonObjectAgg(JsonOnNull, keyExpression, valueExpression)
- description: |
- Builds a JSON object string by aggregating key-value expressions into a single JSON object.
-
- The key expression must return a non-nullable character string. Value expressions can be
- arbitrary, including other JSON functions. If a value is `NULL`, the `ON NULL` behavior
- defines what to do. If omitted, `NULL ON NULL` is assumed by default.
-
- Note that keys must be unique. If a key occurs multiple times, an error will be thrown.
-
- This function is currently not supported in `OVER` windows.
-
- ```sql
- -- '{"Apple":2,"Banana":17,"Orange":0}'
- SELECT
- JSON_OBJECTAGG(KEY product VALUE cnt)
- FROM orders
- ```
- sql: JSON_ARRAY([value]* [ { NULL | ABSENT } ON NULL ])
table: jsonArray(JsonOnNull, values...)
description: |
@@ -890,23 +871,6 @@ json:
-- '[[1]]'
JSON_ARRAY(JSON_ARRAY(1))
```
- - sql: JSON_ARRAYAGG(items [ { NULL | ABSENT } ON NULL ])
- table: jsonArrayAgg(JsonOnNull, itemExpression)
- description: |
- Builds a JSON object string by aggregating items into an array.
-
- Item expressions can be arbitrary, including other JSON functions. If a value is `NULL`, the
- `ON NULL` behavior defines what to do. If omitted, `ABSENT ON NULL` is assumed by default.
-
- This function is currently not supported in `OVER` windows, unbounded session windows, or hop
- windows.
-
- ```sql
- -- '["Apple","Banana","Orange"]'
- SELECT
- JSON_ARRAYAGG(product)
- FROM orders
- ```
valueconstruction:
- sql: |
@@ -1048,6 +1012,42 @@ aggregate:
Divides the rows for each window partition into `n` buckets ranging from 1 to at most `n`.
If the number of rows in the window partition doesn't divide evenly into the number of buckets, then the remainder values are distributed one per bucket, starting with the first bucket.
For example, with 6 rows and 4 buckets, the bucket values would be as follows: 1 1 2 2 3 4
+ - sql: JSON_OBJECTAGG([KEY] key VALUE value [ { NULL | ABSENT } ON NULL ])
+ table: jsonObjectAgg(JsonOnNull, keyExpression, valueExpression)
+ description: |
+ Builds a JSON object string by aggregating key-value expressions into a single JSON object.
+
+ The key expression must return a non-nullable character string. Value expressions can be
+ arbitrary, including other JSON functions. If a value is `NULL`, the `ON NULL` behavior
+ defines what to do. If omitted, `NULL ON NULL` is assumed by default.
+
+ Note that keys must be unique. If a key occurs multiple times, an error will be thrown.
+
+ This function is currently not supported in `OVER` windows and is not supported for use with other aggregate functions.
+
+ ```sql
+ -- '{"Apple":2,"Banana":17,"Orange":0}'
+ SELECT
+ JSON_OBJECTAGG(KEY product VALUE cnt)
+ FROM orders
+ ```
+ - sql: JSON_ARRAYAGG(items [ { NULL | ABSENT } ON NULL ])
+ table: jsonArrayAgg(JsonOnNull, itemExpression)
+ description: |
+ Builds a JSON object string by aggregating items into an array.
+
+ Item expressions can be arbitrary, including other JSON functions. If a value is `NULL`, the
+ `ON NULL` behavior defines what to do. If omitted, `ABSENT ON NULL` is assumed by default.
+
+ This function is currently not supported in `OVER` windows, unbounded session windows, or hop windows. And it is not supported for use with other aggregate functions.
+
+ ```sql
+ -- '["Apple","Banana","Orange"]'
+ SELECT
+ JSON_ARRAYAGG(product)
+ FROM orders
+ ```
catalog:
- sql: CURRENT_DATABASE()
diff --git a/docs/data/sql_functions_zh.yml b/docs/data/sql_functions_zh.yml
index 04f8757a4be..1c432c2d18a 100644
--- a/docs/data/sql_functions_zh.yml
+++ b/docs/data/sql_functions_zh.yml
@@ -941,24 +941,6 @@ json:
)
)
```
- - sql: JSON_OBJECTAGG([KEY] key VALUE value [ { NULL | ABSENT } ON NULL ])
- table: jsonObjectAgg(JsonOnNull, keyExpression, valueExpression)
- description: |
- 通过将 key-value 聚合到单个 JSON 对象中,构建 JSON 对象字符串。
-
- 键表达式必须返回不为空的字符串。值表达式可以是任意的,包括其他 JSON 函数。
- 如果值为 `NULL`,则 `ON NULL` 行为定义了要执行的操作。如果省略,默认情况下假定为 `NULL ON NULL`。
-
- 请注意,键必须是唯一的。如果一个键出现多次,将抛出一个错误。
-
- 目前在 `OVER` windows 中不支持此函数。
-
- ```sql
- -- '{"Apple":2,"Banana":17,"Orange":0}'
- SELECT
- JSON_OBJECTAGG(KEY product VALUE cnt)
- FROM orders
- ```
- sql: JSON_ARRAY([value]* [ { NULL | ABSENT } ON NULL ])
table: jsonArray(JsonOnNull, values...)
description: |
@@ -984,21 +966,6 @@ json:
-- '[[1]]'
JSON_ARRAY(JSON_ARRAY(1))
```
- - sql: JSON_ARRAYAGG(items [ { NULL | ABSENT } ON NULL ])
- table: jsonArrayAgg(JsonOnNull, itemExpression)
- description: |
- 通过将字段聚合到数组中构建 JSON 对象字符串。
-
- 项目表达式可以是任意的,包括其他 JSON 函数。如果值为 `NULL`,则 `ON NULL` 行为定义了要执行的操作。如果省略,默认情况下假定为 `ABSENT ON NULL`。
-
- 此函数目前不支持 `OVER` windows、未绑定的 session windows 或 hop windows。
-
- ```sql
- -- '["Apple","Banana","Orange"]'
- SELECT
- JSON_ARRAYAGG(product)
- FROM orders
- ```
valueconstruction:
- sql: |
@@ -1139,6 +1106,39 @@ aggregate:
将窗口分区中的所有数据按照顺序划分为 n 个分组,返回分配给各行数据的分组编号(从 1 开始)。
如果不能均匀划分为 n 个分组,则从第 1 个分组开始,为每一分组分配一个剩余值。
比如某个窗口分区有 6 行数据,划分为 4 个分组,则各行的分组编号为:1,1,2,2,3,4。
+ - sql: JSON_OBJECTAGG([KEY] key VALUE value [ { NULL | ABSENT } ON NULL ])
+ table: jsonObjectAgg(JsonOnNull, keyExpression, valueExpression)
+ description: |
+ 通过将 key-value 聚合到单个 JSON 对象中,构建 JSON 对象字符串。
+
+ 键表达式必须返回不为空的字符串。值表达式可以是任意的,包括其他 JSON 函数。
+ 如果值为 `NULL`,则 `ON NULL` 行为定义了要执行的操作。如果省略,默认情况下假定为 `NULL ON NULL`。
+
+ 请注意,键必须是唯一的。如果一个键出现多次,将抛出一个错误。
+
+ 目前在 `OVER` windows 中不支持此函数。同时,此函数不支持与其他聚合函数一起使用。
+
+ ```sql
+ -- '{"Apple":2,"Banana":17,"Orange":0}'
+ SELECT
+ JSON_OBJECTAGG(KEY product VALUE cnt)
+ FROM orders
+ ```
+ - sql: JSON_ARRAYAGG(items [ { NULL | ABSENT } ON NULL ])
+ table: jsonArrayAgg(JsonOnNull, itemExpression)
+ description: |
+ 通过将字段聚合到数组中构建 JSON 对象字符串。
+
+ 项目表达式可以是任意的,包括其他 JSON 函数。如果值为 `NULL`,则 `ON NULL` 行为定义了要执行的操作。如果省略,默认情况下假定为 `ABSENT ON NULL`。
+
+ 此函数目前不支持 `OVER` windows、未绑定的 session windows 或 hop windows。同时,此函数不支持与其他聚合函数一起使用。
+
+ ```sql
+ -- '["Apple","Banana","Orange"]'
+ SELECT
+ JSON_ARRAYAGG(product)
+ FROM orders
+ ```
catalog:
- sql: CURRENT_DATABASE()
diff --git a/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/rules/logical/WrapJsonAggFunctionArgumentsRuleTest.java b/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/rules/logical/WrapJsonAggFunctionArgumentsRuleTest.java
index 5836822a107..c19affe02a9 100644
--- a/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/rules/logical/WrapJsonAggFunctionArgumentsRuleTest.java
+++ b/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/rules/logical/WrapJsonAggFunctionArgumentsRuleTest.java
@@ -18,79 +18,105 @@
package org.apache.flink.table.planner.plan.rules.logical;
-import org.apache.flink.table.api.Schema;
import org.apache.flink.table.api.TableConfig;
-import org.apache.flink.table.api.TableDescriptor;
-import org.apache.flink.table.connector.ChangelogMode;
-import org.apache.flink.table.planner.factories.TableFactoryHarness;
-import org.apache.flink.table.planner.utils.StreamTableTestUtil;
import org.apache.flink.table.planner.utils.TableTestBase;
+import org.apache.flink.table.planner.utils.TableTestUtil;
import org.junit.Before;
import org.junit.Test;
+import org.junit.runner.RunWith;
+import org.junit.runners.Parameterized;
-import static org.apache.flink.table.api.DataTypes.INT;
-import static org.apache.flink.table.api.DataTypes.STRING;
+import java.util.Arrays;
+import java.util.Collection;
/** Tests for {@link WrapJsonAggFunctionArgumentsRule}. */
+@RunWith(Parameterized.class)
public class WrapJsonAggFunctionArgumentsRuleTest extends TableTestBase {
- private StreamTableTestUtil util;
+ private final boolean batchMode;
+ private TableTestUtil util;
+
+ @Parameterized.Parameters(name = "batchMode = {0}")
+ public static Collection<Boolean> data() {
+ return Arrays.asList(true, false);
+ }
+
+ public WrapJsonAggFunctionArgumentsRuleTest(boolean batchMode) {
+ this.batchMode = batchMode;
+ }
@Before
- public void before() {
- util = streamTestUtil(TableConfig.getDefault());
+ public void setup() {
+ if (batchMode) {
+ util = batchTestUtil(TableConfig.getDefault());
+ } else {
+ util = streamTestUtil(TableConfig.getDefault());
+ }
+
+ util.tableEnv()
+ .executeSql(
+ "CREATE TABLE T(\n"
+ + " f0 INTEGER,\n"
+ + " f1 VARCHAR,\n"
+ + " f2 BIGINT\n"
+ + ") WITH (\n"
+ + " 'connector' = 'values'\n"
+ + " ,'bounded' = '"
+ + batchMode
+ + "'\n)");
+
+ util.tableEnv()
+ .executeSql(
+ "CREATE TABLE T1(\n"
+ + " f0 INTEGER,\n"
+ + " f1 VARCHAR,\n"
+ + " f2 BIGINT\n"
+ + ") WITH (\n"
+ + " 'connector' = 'values'\n"
+ + " ,'bounded' = '"
+ + batchMode
+ + "'\n)");
}
@Test
public void testJsonObjectAgg() {
- final TableDescriptor sourceDescriptor =
- TableFactoryHarness.newBuilder()
- .schema(Schema.newBuilder().column("f0", STRING()).build())
- .unboundedScanSource(ChangelogMode.all())
- .build();
-
- util.tableEnv().createTable("T", sourceDescriptor);
- util.verifyRelPlan("SELECT JSON_OBJECTAGG(f0 VALUE f0) FROM T");
+ util.verifyRelPlan("SELECT JSON_OBJECTAGG(f1 VALUE f1) FROM T");
}
@Test
public void testJsonObjectAggInGroupWindow() {
- final TableDescriptor sourceDescriptor =
- TableFactoryHarness.newBuilder()
- .schema(
- Schema.newBuilder()
- .column("f0", INT())
- .column("f1", STRING())
- .build())
- .unboundedScanSource()
- .build();
-
- util.tableEnv().createTable("T", sourceDescriptor);
util.verifyRelPlan("SELECT f0, JSON_OBJECTAGG(f1 VALUE f0) FROM T GROUP BY f0");
}
@Test
public void testJsonArrayAgg() {
- final TableDescriptor sourceDescriptor =
- TableFactoryHarness.newBuilder()
- .schema(Schema.newBuilder().column("f0", STRING()).build())
- .unboundedScanSource(ChangelogMode.all())
- .build();
-
- util.tableEnv().createTable("T", sourceDescriptor);
util.verifyRelPlan("SELECT JSON_ARRAYAGG(f0) FROM T");
}
@Test
public void testJsonArrayAggInGroupWindow() {
- final TableDescriptor sourceDescriptor =
- TableFactoryHarness.newBuilder()
- .schema(Schema.newBuilder().column("f0", INT()).build())
- .unboundedScanSource()
- .build();
-
- util.tableEnv().createTable("T", sourceDescriptor);
util.verifyRelPlan("SELECT f0, JSON_ARRAYAGG(f0) FROM T GROUP BY f0");
}
+
+ @Test(expected = AssertionError.class)
+ public void testJsonObjectAggWithOtherAggs() {
+ util.verifyRelPlan("SELECT COUNT(*), JSON_OBJECTAGG(f1 VALUE f1) FROM T");
+ }
+
+ @Test(expected = AssertionError.class)
+ public void testGroupJsonObjectAggWithOtherAggs() {
+ util.verifyRelPlan(
+ "SELECT f0, COUNT(*), JSON_OBJECTAGG(f1 VALUE f0), SUM(f2) FROM T GROUP BY f0");
+ }
+
+ @Test(expected = AssertionError.class)
+ public void testJsonArrayAggWithOtherAggs() {
+ util.verifyRelPlan("SELECT COUNT(*), JSON_ARRAYAGG(f0) FROM T");
+ }
+
+ @Test(expected = AssertionError.class)
+ public void testGroupJsonArrayAggInWithOtherAggs() {
+ util.verifyRelPlan("SELECT f0, COUNT(*), JSON_ARRAYAGG(f0), SUM(f2) FROM T GROUP BY f0");
+ }
}
diff --git a/flink-table/flink-table-planner/src/test/resources/org/apache/flink/table/planner/plan/rules/logical/WrapJsonAggFunctionArgumentsRuleTest.xml b/flink-table/flink-table-planner/src/test/resources/org/apache/flink/table/planner/plan/rules/logical/WrapJsonAggFunctionArgumentsRuleTest.xml
index 2b0b718d12a..b0f9f78efdb 100644
--- a/flink-table/flink-table-planner/src/test/resources/org/apache/flink/table/planner/plan/rules/logical/WrapJsonAggFunctionArgumentsRuleTest.xml
+++ b/flink-table/flink-table-planner/src/test/resources/org/apache/flink/table/planner/plan/rules/logical/WrapJsonAggFunctionArgumentsRuleTest.xml
@@ -16,33 +16,55 @@ See the License for the specific language governing permissions and
limitations under the License.
-->
<Root>
- <TestCase name="testJsonArrayAgg">
+ <TestCase name="testJsonArrayAgg[batchMode = false]">
<Resource name="sql">
<![CDATA[SELECT JSON_ARRAYAGG(f0) FROM T]]>
</Resource>
<Resource name="ast">
<![CDATA[
LogicalAggregate(group=[{}], EXPR$0=[JSON_ARRAYAGG_ABSENT_ON_NULL($0)])
-+- LogicalTableScan(table=[[default_catalog, default_database, T]])
++- LogicalProject(f0=[$0])
+ +- LogicalTableScan(table=[[default_catalog, default_database, T]])
]]>
</Resource>
<Resource name="optimized rel plan">
<![CDATA[
-GroupAggregate(select=[JSON_ARRAYAGG_ABSENT_ON_NULL_RETRACT($f1) AS EXPR$0])
+GroupAggregate(select=[JSON_ARRAYAGG_ABSENT_ON_NULL($f1) AS EXPR$0])
+- Exchange(distribution=[single])
+- Calc(select=[f0, JSON_STRING(f0) AS $f1])
- +- TableSourceScan(table=[[default_catalog, default_database, T]], fields=[f0])
+ +- TableSourceScan(table=[[default_catalog, default_database, T, project=[f0], metadata=[]]], fields=[f0])
]]>
</Resource>
</TestCase>
- <TestCase name="testJsonArrayAggInGroupWindow">
+ <TestCase name="testJsonArrayAgg[batchMode = true]">
+ <Resource name="sql">
+ <![CDATA[SELECT JSON_ARRAYAGG(f0) FROM T]]>
+ </Resource>
+ <Resource name="ast">
+ <![CDATA[
+LogicalAggregate(group=[{}], EXPR$0=[JSON_ARRAYAGG_ABSENT_ON_NULL($0)])
++- LogicalProject(f0=[$0])
+ +- LogicalTableScan(table=[[default_catalog, default_database, T]])
+]]>
+ </Resource>
+ <Resource name="optimized rel plan">
+ <![CDATA[
+SortAggregate(isMerge=[false], select=[JSON_ARRAYAGG_ABSENT_ON_NULL($f1) AS EXPR$0])
++- Calc(select=[f0, JSON_STRING(f0) AS $f1])
+ +- Exchange(distribution=[single])
+ +- TableSourceScan(table=[[default_catalog, default_database, T, project=[f0], metadata=[]]], fields=[f0])
+]]>
+ </Resource>
+ </TestCase>
+ <TestCase name="testJsonArrayAggInGroupWindow[batchMode = false]">
<Resource name="sql">
<![CDATA[SELECT f0, JSON_ARRAYAGG(f0) FROM T GROUP BY f0]]>
</Resource>
<Resource name="ast">
<![CDATA[
LogicalAggregate(group=[{0}], EXPR$1=[JSON_ARRAYAGG_ABSENT_ON_NULL($0)])
-+- LogicalTableScan(table=[[default_catalog, default_database, T]])
++- LogicalProject(f0=[$0])
+ +- LogicalTableScan(table=[[default_catalog, default_database, T]])
]]>
</Resource>
<Resource name="optimized rel plan">
@@ -50,45 +72,112 @@ LogicalAggregate(group=[{0}], EXPR$1=[JSON_ARRAYAGG_ABSENT_ON_NULL($0)])
GroupAggregate(groupBy=[f0], select=[f0, JSON_ARRAYAGG_ABSENT_ON_NULL($f1) AS EXPR$1])
+- Exchange(distribution=[hash[f0]])
+- Calc(select=[f0, JSON_STRING(f0) AS $f1])
- +- TableSourceScan(table=[[default_catalog, default_database, T]], fields=[f0])
+ +- TableSourceScan(table=[[default_catalog, default_database, T, project=[f0], metadata=[]]], fields=[f0])
+]]>
+ </Resource>
+ </TestCase>
+ <TestCase name="testJsonArrayAggInGroupWindow[batchMode = true]">
+ <Resource name="sql">
+ <![CDATA[SELECT f0, JSON_ARRAYAGG(f0) FROM T GROUP BY f0]]>
+ </Resource>
+ <Resource name="ast">
+ <![CDATA[
+LogicalAggregate(group=[{0}], EXPR$1=[JSON_ARRAYAGG_ABSENT_ON_NULL($0)])
++- LogicalProject(f0=[$0])
+ +- LogicalTableScan(table=[[default_catalog, default_database, T]])
+]]>
+ </Resource>
+ <Resource name="optimized rel plan">
+ <![CDATA[
+SortAggregate(isMerge=[false], groupBy=[f0], select=[f0, JSON_ARRAYAGG_ABSENT_ON_NULL($f1) AS EXPR$1])
++- Calc(select=[f0, JSON_STRING(f0) AS $f1])
+ +- Sort(orderBy=[f0 ASC])
+ +- Exchange(distribution=[hash[f0]])
+ +- TableSourceScan(table=[[default_catalog, default_database, T, project=[f0], metadata=[]]], fields=[f0])
]]>
</Resource>
</TestCase>
- <TestCase name="testJsonObjectAggInGroupWindow">
+ <TestCase name="testJsonObjectAgg[batchMode = false]">
+ <Resource name="sql">
+ <![CDATA[SELECT JSON_OBJECTAGG(f1 VALUE f1) FROM T]]>
+ </Resource>
+ <Resource name="ast">
+ <![CDATA[
+LogicalAggregate(group=[{}], EXPR$0=[JSON_OBJECTAGG_NULL_ON_NULL($0, $0)])
++- LogicalProject(f1=[$1])
+ +- LogicalTableScan(table=[[default_catalog, default_database, T]])
+]]>
+ </Resource>
+ <Resource name="optimized rel plan">
+ <![CDATA[
+GroupAggregate(select=[JSON_OBJECTAGG_NULL_ON_NULL($f1, $f1) AS EXPR$0])
++- Exchange(distribution=[single])
+ +- Calc(select=[f1, JSON_STRING(f1) AS $f1])
+ +- TableSourceScan(table=[[default_catalog, default_database, T, project=[f1], metadata=[]]], fields=[f1])
+]]>
+ </Resource>
+ </TestCase>
+ <TestCase name="testJsonObjectAggInGroupWindow[batchMode = true]">
<Resource name="sql">
<![CDATA[SELECT f0, JSON_OBJECTAGG(f1 VALUE f0) FROM T GROUP BY f0]]>
</Resource>
<Resource name="ast">
<![CDATA[
LogicalAggregate(group=[{0}], EXPR$1=[JSON_OBJECTAGG_NULL_ON_NULL($1, $0)])
-+- LogicalTableScan(table=[[default_catalog, default_database, T]])
++- LogicalProject(f0=[$0], f1=[$1])
+ +- LogicalTableScan(table=[[default_catalog, default_database, T]])
]]>
</Resource>
<Resource name="optimized rel plan">
<![CDATA[
-GroupAggregate(groupBy=[f0], select=[f0, JSON_OBJECTAGG_NULL_ON_NULL(f1, $f2) AS EXPR$1])
-+- Exchange(distribution=[hash[f0]])
- +- Calc(select=[f0, f1, JSON_STRING(f0) AS $f2])
- +- TableSourceScan(table=[[default_catalog, default_database, T]], fields=[f0, f1])
+SortAggregate(isMerge=[true], groupBy=[f0], select=[f0, Final_JSON_OBJECTAGG_NULL_ON_NULL(EXPR$1) AS EXPR$1])
++- Sort(orderBy=[f0 ASC])
+ +- Exchange(distribution=[hash[f0]])
+ +- LocalSortAggregate(groupBy=[f0], select=[f0, Partial_JSON_OBJECTAGG_NULL_ON_NULL(f1, $f2) AS EXPR$1])
+ +- Calc(select=[f0, f1, JSON_STRING(f0) AS $f2])
+ +- Sort(orderBy=[f0 ASC])
+ +- TableSourceScan(table=[[default_catalog, default_database, T, project=[f0, f1], metadata=[]]], fields=[f0, f1])
]]>
</Resource>
</TestCase>
- <TestCase name="testJsonObjectAgg">
+ <TestCase name="testJsonObjectAgg[batchMode = true]">
<Resource name="sql">
- <![CDATA[SELECT JSON_OBJECTAGG(f0 VALUE f0) FROM T]]>
+ <![CDATA[SELECT JSON_OBJECTAGG(f1 VALUE f1) FROM T]]>
</Resource>
<Resource name="ast">
<![CDATA[
LogicalAggregate(group=[{}], EXPR$0=[JSON_OBJECTAGG_NULL_ON_NULL($0, $0)])
-+- LogicalTableScan(table=[[default_catalog, default_database, T]])
++- LogicalProject(f1=[$1])
+ +- LogicalTableScan(table=[[default_catalog, default_database, T]])
]]>
</Resource>
<Resource name="optimized rel plan">
<![CDATA[
-GroupAggregate(select=[JSON_OBJECTAGG_NULL_ON_NULL_RETRACT($f1, $f1) AS EXPR$0])
+SortAggregate(isMerge=[true], select=[Final_JSON_OBJECTAGG_NULL_ON_NULL(EXPR$0) AS EXPR$0])
+- Exchange(distribution=[single])
- +- Calc(select=[f0, JSON_STRING(f0) AS $f1])
- +- TableSourceScan(table=[[default_catalog, default_database, T]], fields=[f0])
+ +- LocalSortAggregate(select=[Partial_JSON_OBJECTAGG_NULL_ON_NULL($f1, $f1) AS EXPR$0])
+ +- Calc(select=[f1, JSON_STRING(f1) AS $f1])
+ +- TableSourceScan(table=[[default_catalog, default_database, T, project=[f1], metadata=[]]], fields=[f1])
+]]>
+ </Resource>
+ </TestCase>
+ <TestCase name="testJsonObjectAggInGroupWindow[batchMode = false]">
+ <Resource name="sql">
+ <![CDATA[SELECT f0, JSON_OBJECTAGG(f1 VALUE f0) FROM T GROUP BY f0]]>
+ </Resource>
+ <Resource name="ast">
+ <![CDATA[
+LogicalAggregate(group=[{0}], EXPR$1=[JSON_OBJECTAGG_NULL_ON_NULL($1, $0)])
++- LogicalProject(f0=[$0], f1=[$1])
+ +- LogicalTableScan(table=[[default_catalog, default_database, T]])
+]]>
+ </Resource>
+ <Resource name="optimized rel plan">
+ <![CDATA[
+GroupAggregate(groupBy=[f0], select=[f0, JSON_OBJECTAGG_NULL_ON_NULL(f1, $f2) AS EXPR$1])
++- Exchange(distribution=[hash[f0]])
+ +- Calc(select=[f0, f1, JSON_STRING(f0) AS $f2])
+ +- TableSourceScan(table=[[default_catalog, default_database, T, project=[f0, f1], metadata=[]]], fields=[f0, f1])
]]>
</Resource>
</TestCase>