izchen commented on pull request #3460:
URL: https://github.com/apache/iceberg/pull/3460#issuecomment-970628383
> Eventually, we'll need to update more than just Spark v3.2. Additionally, it would be nice if Flink and even MR could be handled in this PR.
### Flink
Executing this SQL test case in Flink reports the following error:
```
[ERROR] Could not execute SQL statement. Reason:
org.apache.flink.table.api.ValidationException: Data type 'BINARY(2) NOT NULL' with conversion class '[B' does not support a value literal of class 'org.apache.calcite.avatica.util.ByteString'.
```
This appears to be an internal Flink issue that has nothing to do with Iceberg. Adding the following code to Flink solves the problem (I will try to submit a PR to the Flink community later):
```scala
case BINARY =>
  // convert the Calcite ByteString literal value to byte[]
  literal.getValueAs(classOf[Array[Byte]])
```
https://github.com/apache/flink/blob/44378fa5fde6c17e1712a62b834cb6251605f416/flink-table/flink-table-planner/src/main/scala/org/apache/flink/table/planner/plan/utils/RexNodeExtractor.scala#L400-L466
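For reference, the conversion the fix performs is simple: Calcite hands the BINARY literal to Flink as a `ByteString`, and `getValueAs(classOf[Array[Byte]])` unwraps it into the `byte[]` that Flink's `BINARY(2) NOT NULL` type expects. Below is a minimal standalone sketch (my own illustration, not Flink code) of that conversion for the X'110F' value:
```scala
import org.apache.calcite.avatica.util.ByteString

object ByteStringConversionSketch {
  def main(args: Array[String]): Unit = {
    // Calcite represents a BINARY literal such as X'110F' as a ByteString
    val literalValue = ByteString.of("110F", 16)
    // Asking RexLiteral for the value as Array[Byte] is effectively
    // unwrapping the ByteString into its raw bytes
    val bytes: Array[Byte] = literalValue.getBytes
    // Prints "110F", confirming the round trip
    println(bytes.map("%02X".format(_)).mkString)
  }
}
```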
With this fix applied to Flink, the SQL test case runs normally, returns the correct result, and the SQL description string in the Flink UI shows the correct literal X'110F'. So I think this problem does not exist in the flink-runtime module.
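For context, here is a rough sketch (my own illustration, not code from this PR) of the Iceberg-side predicate that such a pushed-down binary filter corresponds to; the column name `b` is made up:
```scala
import java.nio.ByteBuffer
import org.apache.iceberg.expressions.Expressions

object BinaryPredicateSketch {
  // Illustrative only: an equality predicate on a hypothetical binary column "b"
  // holding the value X'110F', expressed with Iceberg's expression API
  val predicate = Expressions.equal("b", ByteBuffer.wrap(Array[Byte](0x11, 0x0F)))
}
```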
### Hive
According to the Hive documentation, Hive has no binary type literal, so I think this problem does not exist in the hive-runtime module.
### Trino
I have not actually used Trino. Judging from the Trino code, it may not have this problem, but I do not have a Trino test environment to verify. In any case, we cannot modify the Trino runtime code in this PR.