fsk119 commented on code in PR #19218:
URL: https://github.com/apache/flink/pull/19218#discussion_r910805081
##########
flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/planner/delegation/hive/SqlFunctionConverter.java:
##########
@@ -97,12 +97,14 @@ public RexNode visitCall(RexCall call) {
((HiveParser.HiveParserSessionState)
SessionState.get())
.getHiveParserCurrentTS();
HiveShim hiveShim = HiveParserUtils.getSessionHiveShim();
- try {
- return HiveParserRexNodeConverter.convertConstant(
- new ExprNodeConstantDesc(hiveShim.toHiveTimestamp(currentTS)), cluster);
- } catch (SemanticException e) {
- throw new FlinkHiveException(e);
- }
+ return convertToLiteral(hiveShim.toHiveTimestamp(currentTS));
Review Comment:
It seems we can use
```
if (convertedOp instanceof SqlCastFunction) {
} else if (convertedOp instanceof FlinkSqlTimestampFunction) {
} else if (convertedOp.getName().equals("current_database")) {
} else {
    return builder.makeCall(convertedOp, visitList(operands, update));
}
```
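For illustration, here is a rough sketch of how the whole dispatch could read with that structure. Only the `FlinkSqlTimestampFunction` branch body is taken from this diff; the other branch bodies are placeholders, not the actual implementation:
```
if (convertedOp instanceof SqlCastFunction) {
    // existing CAST handling stays here, unchanged
} else if (convertedOp instanceof FlinkSqlTimestampFunction) {
    // fold current_timestamp into a literal, as this PR does
    HiveShim hiveShim = HiveParserUtils.getSessionHiveShim();
    return convertToLiteral(hiveShim.toHiveTimestamp(currentTS));
} else if (convertedOp.getName().equals("current_database")) {
    // fold current_database into a literal of the session's current database
} else {
    return builder.makeCall(convertedOp, visitList(operands, update));
}
```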
##########
flink-connectors/flink-connector-hive/src/test/java/org/apache/flink/connectors/hive/HiveDialectQueryITCase.java:
##########
@@ -330,6 +330,22 @@ public void testJoinInvolvingComplexType() throws Exception {
}
}
+ @Test
+ public void testCurrentDatabase() {
+ List<Row> result =
+ CollectionUtil.iteratorToList(
+ tableEnv.executeSql("select
current_database()").collect());
+ assertThat(result.toString()).isEqualTo("[+I[default]]");
+ tableEnv.executeSql("create database db1");
+ tableEnv.executeSql("use db1");
+ result =
+ CollectionUtil.iteratorToList(
+ tableEnv.executeSql("select
current_database()").collect());
+ assertThat(result.toString()).isEqualTo("[+I[db1]]");
+ // switch back to the default database so that the following tests use the default database
+ tableEnv.executeSql("use default");
Review Comment:
It's better if we can use
```
@After
public void cleanup() {
tableEnv.executeSql("use default");
}
```
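With such a hook in place, the trailing `tableEnv.executeSql("use default")` at the end of `testCurrentDatabase` becomes unnecessary, and a failed assertion can no longer leave the session pointing at `db1` for the tests that run afterwards.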
##########
flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/planner/delegation/hive/SqlFunctionConverter.java:
##########
@@ -97,12 +97,14 @@ public RexNode visitCall(RexCall call) {
((HiveParser.HiveParserSessionState)
SessionState.get())
.getHiveParserCurrentTS();
HiveShim hiveShim = HiveParserUtils.getSessionHiveShim();
- try {
- return HiveParserRexNodeConverter.convertConstant(
- new ExprNodeConstantDesc(hiveShim.toHiveTimestamp(currentTS)), cluster);
- } catch (SemanticException e) {
- throw new FlinkHiveException(e);
- }
+ return convertToLiteral(hiveShim.toHiveTimestamp(currentTS));
+ } else if (convertedOp.getName().equals("current_database")) {
Review Comment:
I think it's very hacky to fix the problem in this way. Currently, we
determine which function is used by comparing the method name, parameters, and
return type, but here we only consider the function name.
I think we should introduce a class similar to `SqlCastFunction` here and
use `instanceof` to determine the function.
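For example, something along these lines might work (the class name `HiveCurrentDatabaseFunction` and its exact declaration are hypothetical, just to illustrate the idea):
```
import org.apache.calcite.sql.SqlFunction;
import org.apache.calcite.sql.SqlFunctionCategory;
import org.apache.calcite.sql.SqlKind;
import org.apache.calcite.sql.type.OperandTypes;
import org.apache.calcite.sql.type.ReturnTypes;

/** Hypothetical dedicated operator for Hive's current_database(), analogous to SqlCastFunction. */
public class HiveCurrentDatabaseFunction extends SqlFunction {
    public HiveCurrentDatabaseFunction() {
        super(
                "current_database",
                SqlKind.OTHER_FUNCTION,
                ReturnTypes.VARCHAR_2000, // returns the database name as a string
                null, // no operand type inference: the function takes no operands
                OperandTypes.NILADIC, // zero arguments
                SqlFunctionCategory.SYSTEM);
    }
}
```
The branch above would then become `convertedOp instanceof HiveCurrentDatabaseFunction` instead of a name comparison.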