hehuiyuan commented on code in PR #19423:
URL: https://github.com/apache/flink/pull/19423#discussion_r848489334
##########
flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/functions/hive/HiveGenericUDAF.java:
##########
@@ -173,6 +178,14 @@ public GenericUDAFEvaluator.AggregationBuffer createAccumulator() {
     public void accumulate(GenericUDAFEvaluator.AggregationBuffer acc, Object... inputs)
             throws HiveException {
+        // When the parameters of the function are (Integer, Array[Double]), Flink calls
+        // udf.eval(AggregationBuffer, Integer, Array[Double]), which is not a problem.
+        // But when the parameter is a single array, Flink calls udf.accumulate(AggregationBuffer,
+        // Array[Double]); at this point Java's var-args take the Array[Double] as the
+        // Object... args array itself, so we need to wrap it.
+        if (isArgsSingleArray) {
+            inputs = new Object[] {inputs};
+        }
         if (!allIdentityConverter) {
             for (int i = 0; i < inputs.length; i++) {
Review Comment:
I think there is a question here: `conversions` length is 1, but `inputs` length
is not, when the argument type is array(double).
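
For illustration, here is a minimal standalone sketch (hypothetical `VarargsDemo` class, not Flink code) of the var-args behavior the patch comment describes, assuming the single argument is a `Double[]`:

```java
// Minimal sketch of Java var-args behavior with a single array argument.
public class VarargsDemo {
    static void accumulate(Object... inputs) {
        System.out.println("inputs.length = " + inputs.length);
    }

    public static void main(String[] args) {
        Double[] values = {1.0, 2.0, 3.0};

        // A single Object[]-compatible array becomes the var-args array itself:
        // prints inputs.length = 3, even though only one logical argument was passed.
        accumulate(values);

        // Wrapping it, as the patch does with `inputs = new Object[] {inputs}`,
        // keeps it as one logical argument: prints inputs.length = 1.
        accumulate(new Object[] {values});
    }
}
```

So before the wrapping, `inputs.length` equals the number of array elements while `conversions.length` is 1, which is why the lengths can disagree in the loop below.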
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]