zachjsh commented on code in PR #13627:
URL: https://github.com/apache/druid/pull/13627#discussion_r1067650355
##########
sql/src/main/java/org/apache/druid/sql/calcite/external/Externals.java:
##########
@@ -145,49 +174,41 @@ public Consistency getConsistency()
}
/**
- * Convert the actual arguments to SQL external table function into a catalog
- * resolved table, then convert that to an external table spec usable by MSQ.
- *
- * @param tableDefn catalog definition of the kind of external table
- * @param parameters the parameters to the SQL table macro
- * @param arguments the arguments that match the parameters. Optional arguments
- * may be null
- * @param schema the external table schema provided by the EXTEND clause
- * @param jsonMapper the JSON mapper to use for value conversions
- * @return a spec with the three values that MSQ needs to create an external table
+ * Convert the list of Calcite function arguments to a map of non-null arguments.
+ * The resulting map must be mutable as processing may rewrite values.
*/
- public static ExternalTableSpec convertArguments(
- final ExternalTableDefn tableDefn,
- final List<FunctionParameter> parameters,
- final List<Object> arguments,
- final SqlNodeList schema,
- final ObjectMapper jsonMapper
+ public static Map<String, Object> convertArguments(
+ final TableFunction fn,
+ final List<Object> arguments
)
{
- final TableBuilder builder = TableBuilder.of(tableDefn);
- for (int i = 0; i < parameters.size(); i++) {
- String name = parameters.get(i).getName();
- Object value = arguments.get(i);
- if (value == null) {
- continue;
+ final List<TableFunction.ParameterDefn> params = fn.parameters();
+ final Map<String, Object> argMap = new HashMap<>();
+ for (int i = 0; i < arguments.size(); i++) {
+ final Object value = arguments.get(i);
+ if (value != null) {
+ argMap.put(params.get(i).name(), value);
}
- PropertyDefn<?> prop = tableDefn.property(name);
- builder.property(name, prop.decodeSqlValue(value, jsonMapper));
}
+ return argMap;
+ }
- // Converts from a list of (identifier, type, ...) pairs to
- // a Druid row signature. The schema itself comes from the
- // Druid-specific EXTEND syntax added to the parser.
+ /**
+ * Converts from a list of (identifier, type, ...) pairs to
+ * list of column specs. The schema itself comes from the
+ * Druid-specific EXTEND syntax added to the parser.
+ */
+ public static List<ColumnSpec> convertColumns(SqlNodeList schema)
+ {
+ final List<ColumnSpec> columns = new ArrayList<>();
for (int i = 0; i < schema.size(); i += 2) {
final String name = convertName((SqlIdentifier) schema.get(i));
- String sqlType = convertType(name, (SqlDataTypeSpec) schema.get(i + 1));
- builder.column(name, sqlType);
+ final String sqlType = convertType(name, (SqlDataTypeSpec) schema.get(i + 1));
+ columns.add(new ColumnSpec(name, sqlType, null));
Review Comment:
Since the type can now be null, what does this mean for iterating over the
nodeList? Is the user expected to put some placeholder identifier to indicate
that the type is null / not specified? And if not, will iterating every two
nodes, as we're doing here, be problematic in that case?
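To make the concern concrete, here is a simplified sketch of the stride-2 walk. The `List<String>` stands in for Calcite's `SqlNodeList`, with a string per identifier and `null` as a hypothetical placeholder for an omitted type; the real code operates on `SqlIdentifier`/`SqlDataTypeSpec` nodes. If an untyped column contributes only one node instead of a (name, placeholder) pair, the flat list has odd length and the pairing breaks:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class ExtendPairing {
  // Simplified stand-in for the (identifier, type, identifier, type, ...)
  // layout that the stride-2 loop in convertColumns assumes. A null element
  // models a "type not specified" placeholder.
  static List<String[]> pairwise(List<String> flat) {
    List<String[]> columns = new ArrayList<>();
    for (int i = 0; i < flat.size(); i += 2) {
      columns.add(new String[]{flat.get(i), flat.get(i + 1)});
    }
    return columns;
  }

  public static void main(String[] args) {
    // With a placeholder, every identifier still has a partner node,
    // so the stride-2 walk pairs names and types correctly.
    List<String> withPlaceholder = Arrays.asList("a", "VARCHAR", "b", null);
    for (String[] col : pairwise(withPlaceholder)) {
      System.out.println(col[0] + " -> " + col[1]);
    }

    // If the type node is simply dropped rather than replaced by a
    // placeholder, the list has odd length and the final iteration
    // reads past the end of the list.
    List<String> dropped = Arrays.asList("a", "VARCHAR", "b");
    try {
      pairwise(dropped);
    } catch (IndexOutOfBoundsException e) {
      System.out.println("odd-length list: " + e.getClass().getSimpleName());
    }
  }
}
```

So as long as the parser always emits a placeholder node for an omitted type, the every-two-nodes iteration stays safe; only a genuinely missing node would be a problem.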
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]