liuyongvs opened a new pull request, #22917:
URL: https://github.com/apache/flink/pull/22917
The return type is not right: it should always be nullable. We cannot know at compile time whether the result will be null; that is only known at runtime.
For example, given this DDL:
```
CREATE TABLE data_source (
a array<int not null> not null
) WITH (
'connector'='xxx'
);
```
```
-- `a` may be array(), i.e. an empty array; in that case the result is null.
-- If the return type were INT NOT NULL, there would be no way to represent that null.
select array_max(a) from data_source;
```
Spark's `ArrayMax` handles this the same way, declaring the result nullable:
```scala
case class ArrayMax(child: Expression)
  extends UnaryExpression with ImplicitCastInputTypes with NullIntolerant {

  override def nullable: Boolean = true

  @transient override lazy val dataType: DataType = child.dataType match {
    case ArrayType(dt, _) => dt
    case _ => throw new IllegalStateException(s"$prettyName accepts only arrays.")
  }
```
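The same point can be sketched in plain Java (a minimal illustration, not Flink's actual runtime code; `arrayMax` is a hypothetical helper): the empty-array case forces a nullable result type, which corresponds to a boxed `Integer` rather than a primitive `int`.

```java
import java.util.Arrays;

public class ArrayMaxSketch {

    // Returns the maximum element, or null when the array is empty.
    // The boxed Integer return type is the Java analogue of a nullable INT;
    // a primitive int (the analogue of INT NOT NULL) could not encode "no value".
    static Integer arrayMax(int[] values) {
        if (values == null || values.length == 0) {
            // This condition is only discoverable at runtime,
            // which is why the planner must assume the result is nullable.
            return null;
        }
        return Arrays.stream(values).max().getAsInt();
    }

    public static void main(String[] args) {
        System.out.println(arrayMax(new int[] {3, 1, 2})); // 3
        System.out.println(arrayMax(new int[] {}));        // null
    }
}
```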
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]