mihaibudiu commented on code in PR #4290:
URL: https://github.com/apache/calcite/pull/4290#discussion_r2037927704
##########
core/src/main/java/org/apache/calcite/sql/dialect/ClickHouseSqlDialect.java:
##########
@@ -216,6 +216,24 @@ private static SqlDataTypeSpec
createSqlDataTypeSpecByName(String typeAlias,
}
switch (call.getKind()) {
+ case MAP_VALUE_CONSTRUCTOR:
+ writer.print(call.getOperator().getName().toLowerCase(Locale.ROOT));
+ final SqlWriter.Frame mapFrame = writer.startList("(", ")");
+ for (int i = 0; i < call.operandCount(); i++) {
+ writer.sep(",");
+ call.operand(i).unparse(writer, leftPrec, rightPrec);
+ }
+ writer.endList(mapFrame);
+ break;
+ case ARRAY_VALUE_CONSTRUCTOR:
+ writer.print(call.getOperator().getName().toLowerCase(Locale.ROOT));
Review Comment:
Your example uses the ARRAY constructor, which is not a Spark feature.
Note that the `ARRAY()` function works without arguments, and there are
lots of tests in the codebase that use `array()`.
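
For illustration, here is a minimal standalone sketch (hypothetical, not Calcite code; the class and method names are made up) of the output the unparse loop in the diff produces — the operator name lowercased, followed by a parenthesized, comma-separated operand list, including the zero-argument `array()` case mentioned above:

```java
import java.util.List;
import java.util.Locale;
import java.util.stream.Collectors;

public class ConstructorUnparseSketch {
  // Mimics the writer.print / startList / sep / endList sequence:
  // lowercase the operator name, then emit "(op1, op2, ...)".
  static String unparse(String operatorName, List<String> operands) {
    return operatorName.toLowerCase(Locale.ROOT)
        + operands.stream().collect(Collectors.joining(", ", "(", ")"));
  }

  public static void main(String[] args) {
    System.out.println(unparse("MAP", List.of("'a'", "1")));  // map('a', 1)
    System.out.println(unparse("ARRAY", List.of()));          // array()
  }
}
```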