[ 
https://issues.apache.org/jira/browse/FLINK-25482?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17466503#comment-17466503
 ] 

miaojianlong commented on FLINK-25482:
--------------------------------------

Please assign this issue to me.

> Hive Lookup Join with decimal type failed
> -----------------------------------------
>
>                 Key: FLINK-25482
>                 URL: https://issues.apache.org/jira/browse/FLINK-25482
>             Project: Flink
>          Issue Type: Bug
>          Components: Connectors / Hive
>    Affects Versions: 1.12.3, 1.13.5, 1.14.2
>            Reporter: miaojianlong
>            Priority: Critical
>
> Hive Lookup Join with a decimal type fails.
> The exception is thrown here, in java.math.BigInteger:
> {code:java}
> public BigInteger(byte[] val) {
>     if (val.length == 0)
>         throw new NumberFormatException("Zero length BigInteger");{code}
> The root cause is in
> org.apache.flink.connectors.hive.read.HiveInputFormatPartitionReader#hasNext:
> {code:java}
> private boolean hasNext() throws IOException {
>     if (inputSplits.length > 0) {
>         if (hiveTableInputFormat.reachedEnd()
>                 && readingSplitId == inputSplits.length - 1) {
>             return false;
>         } else if (hiveTableInputFormat.reachedEnd()) {
>             readingSplitId++;
>             hiveTableInputFormat.open(inputSplits[readingSplitId]);
>         }
>         return true;
>     }
>     return false;
> }{code}
> When the Hive table has more than one file and we reach the end of the
> first input split, hasNext() opens the next split but does not trigger the
> batch read of the new file. The subsequent readRecord() call therefore
> returns empty data. For a decimal column this raises the exception above,
> because BigInteger is constructed from a zero-length byte array; in other
> scenarios it silently produces extra empty rows.
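The split-boundary behaviour described above can be modelled with a small self-contained sketch. MockInputFormat, Reader, buggyHasNext, and fixedHasNext are hypothetical names, not Flink classes; the mock only captures the one property that matters here, namely that reachedEnd() is the call that buffers (batch-reads) the next record. The loop-based fixedHasNext() is one possible fix (re-check reachedEnd() after opening each new split), offered as an illustration rather than as the patch that was actually merged:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Minimal stand-in for HiveTableInputFormat (hypothetical model, not the
// real Flink class): reachedEnd() lazily buffers the next record, mirroring
// how the real format triggers its batch read from that method.
class MockInputFormat {
    private int[] split;
    private int pos;
    private Integer buffered;

    void open(int[] split) {
        this.split = split;
        this.pos = 0;
        this.buffered = null;
    }

    boolean reachedEnd() {
        if (buffered == null && pos < split.length) {
            buffered = split[pos++]; // the "batch read" happens only here
        }
        return buffered == null;
    }

    Integer nextRecord() {
        Integer r = buffered; // null if reachedEnd() was never called after open()
        buffered = null;
        return r;
    }
}

class Reader {
    final MockInputFormat fmt = new MockInputFormat();
    final int[][] splits;
    int readingSplitId = 0;

    Reader(int[][] splits) {
        this.splits = splits;
        fmt.open(splits[0]);
    }

    // The logic from the report: after opening the next split it returns
    // true without calling reachedEnd() again, so nothing is buffered and
    // the following nextRecord() yields an empty (null) record.
    boolean buggyHasNext() {
        if (splits.length > 0) {
            if (fmt.reachedEnd() && readingSplitId == splits.length - 1) {
                return false;
            } else if (fmt.reachedEnd()) {
                readingSplitId++;
                fmt.open(splits[readingSplitId]);
            }
            return true;
        }
        return false;
    }

    // One possible fix: loop on reachedEnd(), so each newly opened split is
    // checked again, which triggers the buffering before nextRecord() runs.
    boolean fixedHasNext() {
        if (splits.length == 0) {
            return false;
        }
        while (fmt.reachedEnd()) {
            if (readingSplitId == splits.length - 1) {
                return false;
            }
            readingSplitId++;
            fmt.open(splits[readingSplitId]);
        }
        return true;
    }
}

public class SplitBoundarySketch {
    static List<Integer> drain(Reader r, boolean useFix) {
        List<Integer> out = new ArrayList<>();
        while (useFix ? r.fixedHasNext() : r.buggyHasNext()) {
            out.add(r.fmt.nextRecord());
        }
        return out;
    }

    public static void main(String[] args) {
        int[][] twoFiles = {{1, 2}, {3, 4}};
        // Buggy path emits a null "empty row" at the split boundary.
        System.out.println(drain(new Reader(twoFiles), false)); // [1, 2, null, 3, 4]
        // Fixed path reads both splits cleanly.
        System.out.println(drain(new Reader(twoFiles), true));  // [1, 2, 3, 4]
    }
}
```

The null at the boundary in the buggy run corresponds to the empty batch handed to readRecord: for a decimal column that empty value becomes a zero-length byte array passed to BigInteger, which throws.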



--
This message was sent by Atlassian Jira
(v8.20.1#820001)
