[
https://issues.apache.org/jira/browse/FLINK-37212?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Ferenc Csaky closed FLINK-37212.
--------------------------------
Resolution: Fixed
{{[ca5a4aa|https://github.com/apache/flink-connector-hbase/commit/ca5a4aae3e953841371e9a568ad8472a5ef8c1d2]}}
in main
> When hbase region move, throw "Unable to load exception received from
> server: XXX"
> -----------------------------------------------------------------------------------
>
> Key: FLINK-37212
> URL: https://issues.apache.org/jira/browse/FLINK-37212
> Project: Flink
> Issue Type: Bug
> Components: Connectors / HBase
> Affects Versions: hbase-4.0.0
> Reporter: jonasjc
> Assignee: jonasjc
> Priority: Major
> Labels: pull-request-available
> Fix For: hbase-4.1.0
>
>
> *Problem:*
> When an HBase region moves, the HBase connector throws the following exception:
> {code:java}
> Caused by:
> org.apache.flink.hbase.shaded.org.apache.hadoop.hbase.DoNotRetryIOException:
> Unable to load exception received from
> server:org.apache.hadoop.hbase.NotServingRegionException at
> org.apache.flink.hbase.shaded.org.apache.hadoop.hbase.ipc.RemoteWithExtrasException.unwrapRemoteException(RemoteWithExtrasException.java:85)
> at
> org.apache.flink.hbase.shaded.org.apache.hadoop.hbase.protobuf.ProtobufUtil.makeIOExceptionOfException(ProtobufUtil.java:282)
> at
> org.apache.flink.hbase.shaded.org.apache.hadoop.hbase.protobuf.ProtobufUtil.handleRemoteException(ProtobufUtil.java:269)
> at
> org.apache.flink.hbase.shaded.org.apache.hadoop.hbase.client.RegionServerCallable.call(RegionServerCallable.java:129)
> at
> org.apache.flink.hbase.shaded.org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithRetries(RpcRetryingCallerImpl.java:107)
> at
> org.apache.flink.hbase.shaded.org.apache.hadoop.hbase.client.HTable.get(HTable.java:384)
> at
> org.apache.flink.hbase.shaded.org.apache.hadoop.hbase.client.HTable.get(HTable.java:358)
> at
> org.apache.flink.connector.hbase.source.HBaseRowDataLookupFunction.lookup(HBaseRowDataLookupFunction.java:98)
> ... 18 more
> Caused by:
> org.apache.flink.hbase.shaded.org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.NotServingRegionException):
> org.apache.hadoop.hbase.NotServingRegionException:
> DIM:jichun_test_dim,,1736922383279.ebbf1db8150deec82662ff4087dbcc67. is not
> online on xxx at
> org.apache.hadoop.hbase.regionserver.HRegionServer.getRegionByEncodedName(HRegionServer.java:3462)
> at
> org.apache.hadoop.hbase.regionserver.HRegionServer.getRegion(HRegionServer.java:3439)
> at
> org.apache.hadoop.hbase.regionserver.RSRpcServices.getRegion(RSRpcServices.java:1488)
> at
> org.apache.hadoop.hbase.regionserver.RSRpcServices.get(RSRpcServices.java:2561)
> at
> org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:45815)
> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:392) at
> org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:133) at
> org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:359)
> at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:339)
> {code}
> and the following error:
> {code:java}
> 2025-01-24 11:11:41,930 [Legacy Source Thread - Source:
> TableSourceScan(table=[[default_catalog, default_database, datagen_test]],
> fields=[c1]) ->
> LookupJoin(table=[default_catalog.default_database.dim_kefu_ticket_detail],
> joinType=[LeftOuterJoin], lookup=[rowkey=c1], select=[c1, rowkey]) -> Sink:
> Sink(table=[default_catalog.default_database.sink_test], fields=[c1, rowkey])
> (1/1)#0] [] ERROR
> org.apache.flink.connector.hbase.source.HBaseRowDataLookupFunction(105) -
> HBase lookup error, retry times = 0
> org.apache.flink.hbase.shaded.org.apache.hadoop.hbase.DoNotRetryIOException:
> Unable to load exception received from
> server:org.apache.hadoop.hbase.exceptions.RegionMovedException at
> org.apache.flink.hbase.shaded.org.apache.hadoop.hbase.ipc.RemoteWithExtrasException.unwrapRemoteException(RemoteWithExtrasException.java:85)
>
> ~[blob_p-05917e94670b1c6e110fce2c94b45405cea035d1-9465ef6bc8090a3af4d68959bb598539:4.0-SNAPSHOT]
> at
> org.apache.flink.hbase.shaded.org.apache.hadoop.hbase.protobuf.ProtobufUtil.makeIOExceptionOfException(ProtobufUtil.java:282)
>
> ~[blob_p-05917e94670b1c6e110fce2c94b45405cea035d1-9465ef6bc8090a3af4d68959bb598539:4.0-SNAPSHOT]
> at
> org.apache.flink.hbase.shaded.org.apache.hadoop.hbase.protobuf.ProtobufUtil.handleRemoteException(ProtobufUtil.java:269)
>
> ~[blob_p-05917e94670b1c6e110fce2c94b45405cea035d1-9465ef6bc8090a3af4d68959bb598539:4.0-SNAPSHOT]
> at
> org.apache.flink.hbase.shaded.org.apache.hadoop.hbase.client.RegionServerCallable.call(RegionServerCallable.java:129)
>
> ~[blob_p-05917e94670b1c6e110fce2c94b45405cea035d1-9465ef6bc8090a3af4d68959bb598539:4.0-SNAPSHOT]
> at
> org.apache.flink.hbase.shaded.org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithRetries(RpcRetryingCallerImpl.java:107)
>
> ~[blob_p-05917e94670b1c6e110fce2c94b45405cea035d1-9465ef6bc8090a3af4d68959bb598539:4.0-SNAPSHOT]
> at
> org.apache.flink.hbase.shaded.org.apache.hadoop.hbase.client.HTable.get(HTable.java:384)
>
> ~[blob_p-05917e94670b1c6e110fce2c94b45405cea035d1-9465ef6bc8090a3af4d68959bb598539:4.0-SNAPSHOT]
> at
> org.apache.flink.hbase.shaded.org.apache.hadoop.hbase.client.HTable.get(HTable.java:358)
>
> ~[blob_p-05917e94670b1c6e110fce2c94b45405cea035d1-9465ef6bc8090a3af4d68959bb598539:4.0-SNAPSHOT]
> at
> org.apache.flink.connector.hbase.source.HBaseRowDataLookupFunction.lookup(HBaseRowDataLookupFunction.java:98)
>
> [blob_p-05917e94670b1c6e110fce2c94b45405cea035d1-9465ef6bc8090a3af4d68959bb598539:4.0-SNAPSHOT]
> at
> org.apache.flink.table.functions.LookupFunction.eval(LookupFunction.java:52)
> [flink-table-api-java-uber-du-1.16-SNAPSHOT.jar:du-1.16-SNAPSHOT] at
> LookupFunction$6.flatMap(Unknown Source)
> [flink-table-runtime-du-1.16-SNAPSHOT.jar:?] at
> org.apache.flink.table.runtime.operators.join.lookup.LookupJoinRunner.doFetch(LookupJoinRunner.java:92)
> [flink-table-runtime-du-1.16-SNAPSHOT.jar:du-1.16-SNAPSHOT] at
> org.apache.flink.table.runtime.operators.join.lookup.LookupJoinRunner.processElement(LookupJoinRunner.java:79)
> [flink-table-runtime-du-1.16-SNAPSHOT.jar:du-1.16-SNAPSHOT] at
> org.apache.flink.table.runtime.operators.join.lookup.LookupJoinRunner.processElement(LookupJoinRunner.java:34)
> [flink-table-runtime-du-1.16-SNAPSHOT.jar:du-1.16-SNAPSHOT] at
> org.apache.flink.streaming.api.operators.ProcessOperator.processElement(ProcessOperator.java:66)
> [flink-dist-du-1.16-SNAPSHOT.jar:du-1.16-SNAPSHOT] at
> org.apache.flink.streaming.runtime.tasks.ChainingOutput.pushToOperator(ChainingOutput.java:99)
> [flink-dist-du-1.16-SNAPSHOT.jar:du-1.16-SNAPSHOT] at
> org.apache.flink.streaming.runtime.tasks.ChainingOutput.collect(ChainingOutput.java:80)
> [flink-dist-du-1.16-SNAPSHOT.jar:du-1.16-SNAPSHOT] at
> org.apache.flink.streaming.runtime.tasks.ChainingOutput.collect(ChainingOutput.java:39)
> [flink-dist-du-1.16-SNAPSHOT.jar:du-1.16-SNAPSHOT] at
> org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:56)
> [flink-dist-du-1.16-SNAPSHOT.jar:du-1.16-SNAPSHOT] at
> org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:29)
> [flink-dist-du-1.16-SNAPSHOT.jar:du-1.16-SNAPSHOT] at
> org.apache.flink.streaming.api.operators.StreamSourceContexts$ManualWatermarkContext.processAndCollect(StreamSourceContexts.java:418)
> [flink-dist-du-1.16-SNAPSHOT.jar:du-1.16-SNAPSHOT] at
> org.apache.flink.streaming.api.operators.StreamSourceContexts$WatermarkContext.collect(StreamSourceContexts.java:513)
> [flink-dist-du-1.16-SNAPSHOT.jar:du-1.16-SNAPSHOT] at
> org.apache.flink.streaming.api.operators.StreamSourceContexts$SwitchingOnClose.collect(StreamSourceContexts.java:103)
> [flink-dist-du-1.16-SNAPSHOT.jar:du-1.16-SNAPSHOT] at
> org.apache.flink.streaming.api.functions.source.datagen.DataGeneratorSource.run(DataGeneratorSource.java:117)
> [flink-dist-du-1.16-SNAPSHOT.jar:du-1.16-SNAPSHOT] at
> org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:110)
> [flink-dist-du-1.16-SNAPSHOT.jar:du-1.16-SNAPSHOT] at
> org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:67)
> [flink-dist-du-1.16-SNAPSHOT.jar:du-1.16-SNAPSHOT] at
> org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:333)
> [flink-dist-du-1.16-SNAPSHOT.jar:du-1.16-SNAPSHOT]
> Caused by:
> org.apache.flink.hbase.shaded.org.apache.hadoop.hbase.ipc.RemoteWithExtrasException:
> org.apache.hadoop.hbase.exceptions.RegionMovedException: Region moved to:
> hostname=xxx port=16020 startCode=1736822871845. As of locationSeqNum=54.
> at
> org.apache.hadoop.hbase.regionserver.HRegionServer.getRegionByEncodedName(HRegionServer.java:3453)
> at
> org.apache.hadoop.hbase.regionserver.HRegionServer.getRegion(HRegionServer.java:3439)
> at
> org.apache.hadoop.hbase.regionserver.RSRpcServices.getRegion(RSRpcServices.java:1488)
> at
> org.apache.hadoop.hbase.regionserver.RSRpcServices.get(RSRpcServices.java:2561)
> at
> org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:45815)
> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:392) at
> org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:133) at
> org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:359)
> at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:339)
> {code}
> *Cause:*
> It is caused by class relocation: 'org.apache.hadoop.hbase.NotServingRegionException' has been
> relocated to
> 'org.apache.flink.hbase.shaded.org.apache.hadoop.hbase.NotServingRegionException',
> so the class 'org.apache.hadoop.hbase.NotServingRegionException' named in the
> error message returned by the HBase server cannot be loaded on the client side.
> The same applies to 'org.apache.hadoop.hbase.exceptions.RegionMovedException'.
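>
> In other words, the server reports the exception class by its original (unrelocated) name, and the shaded client then tries to load that name reflectively, which fails. The following is a simplified sketch of that unwrap path (illustrative names only, not the exact HBase code; the real logic sits in the shaded RemoteWithExtrasException/ProtobufUtil classes shown in the stack traces above):
> {code:java}
> import java.io.IOException;
> import java.lang.reflect.Constructor;
>
> public class ShadedUnwrapSketch {
>
>     // Simplified sketch of how the client turns a server-reported exception
>     // class name back into a local exception instance.
>     static IOException unwrapRemoteException(String classNameFromServer, String message) {
>         try {
>             // The server reports "org.apache.hadoop.hbase.NotServingRegionException",
>             // but the shaded connector jar only ships the relocated
>             // "org.apache.flink.hbase.shaded.org.apache.hadoop.hbase.NotServingRegionException",
>             // so this lookup throws ClassNotFoundException.
>             Class<?> exceptionClass = Class.forName(classNameFromServer);
>             Constructor<?> ctor = exceptionClass.getConstructor(String.class);
>             return (IOException) ctor.newInstance(message);
>         } catch (ReflectiveOperationException e) {
>             // This is where the "Unable to load exception received from server: ..."
>             // message in the logs comes from (a DoNotRetryIOException in the real code).
>             return new IOException(
>                     "Unable to load exception received from server:" + classNameFromServer, e);
>         }
>     }
>
>     public static void main(String[] args) {
>         System.out.println(unwrapRemoteException(
>                 "org.apache.hadoop.hbase.NotServingRegionException",
>                 "region is not online"));
>     }
> }
> {code}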
>
> *Fix:*
> Do not relocate the classes 'org.apache.hadoop.hbase.NotServingRegionException' and
> 'org.apache.hadoop.hbase.exceptions.RegionMovedException'.
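>
> The actual change is linked in the resolution comment above (commit ca5a4aa). As an illustration only, a maven-shade-plugin relocation of roughly this shape (module and surrounding configuration assumed, not copied from the commit) keeps the two classes under their original names so the client can resolve them:
> {code:xml}
> <plugin>
>   <groupId>org.apache.maven.plugins</groupId>
>   <artifactId>maven-shade-plugin</artifactId>
>   <configuration>
>     <relocations>
>       <relocation>
>         <pattern>org.apache.hadoop.hbase</pattern>
>         <shadedPattern>org.apache.flink.hbase.shaded.org.apache.hadoop.hbase</shadedPattern>
>         <excludes>
>           <!-- Keep original names so server-reported exceptions can be loaded. -->
>           <exclude>org.apache.hadoop.hbase.NotServingRegionException</exclude>
>           <exclude>org.apache.hadoop.hbase.exceptions.RegionMovedException</exclude>
>         </excludes>
>       </relocation>
>     </relocations>
>   </configuration>
> </plugin>
> {code}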
--
This message was sent by Atlassian Jira
(v8.20.10#820010)