Sorry for the issue you've hit, Yang Zhang. You may need to do the following to recover (rough sketch below):

- drop the table from the hbase shell
- create a snapshot of the SYSTEM.CATALOG table, just in case
- delete the rows for the table from the SYSTEM.CATALOG table (i.e. issue a DELETE FROM SYSTEM.CATALOG WHERE TABLE_SCHEM = <your schema name> AND TABLE_NAME = <your table name>)
- bounce your cluster (since the SYSTEM.CATALOG table is cached)
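For example, a minimal sketch of those steps, assuming the table is the MAGNETISM_MODEL from your stack trace and has no schema; adjust the names and the snapshot label to your setup:

# from the hbase shell: drop the underlying table, then snapshot SYSTEM.CATALOG
hbase> disable 'MAGNETISM_MODEL'
hbase> drop 'MAGNETISM_MODEL'
hbase> snapshot 'SYSTEM.CATALOG', 'catalog_backup_before_cleanup'

-- from sqlline.py: remove the table's metadata rows
DELETE FROM SYSTEM.CATALOG
WHERE TABLE_SCHEM IS NULL   -- use TABLE_SCHEM = '<your schema name>' if the table has one
AND TABLE_NAME = 'MAGNETISM_MODEL';

Then restart the cluster so the region servers drop their cached copy of the metadata.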
On Tue, Sep 6, 2016 at 5:56 PM, Yang Zhang <zhang.yang...@gmail.com> wrote:
> I got the DoNotRetryIOException again.
>
> This time I just tried to create a table. When I tried to drop it, I got
> this exception. Here is the exception stack:
>
> Error: org.apache.hadoop.hbase.DoNotRetryIOException: MAGNETISM_MODEL: 6
>         at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:84)
>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:1298)
>         at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:10525)
>         at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:6864)
>         at org.apache.hadoop.hbase.regionserver.HRegionServer.execServiceOnRegion(HRegionServer.java:3415)
>         at org.apache.hadoop.hbase.regionserver.HRegionServer.execService(HRegionServer.java:3397)
>         at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:29998)
>         at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2078)
>         at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>         at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:114)
>         at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:94)
>         at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.ArrayIndexOutOfBoundsException: 6
>         at org.apache.phoenix.schema.PTableImpl.init(PTableImpl.java:354)
>         at org.apache.phoenix.schema.PTableImpl.<init>(PTableImpl.java:276)
>         at org.apache.phoenix.schema.PTableImpl.makePTable(PTableImpl.java:265)
>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:811)
>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:448)
>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doDropTable(MetaDataEndpointImpl.java:1318)
>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:1272)
>         ... 10 more
>
> SQLState: 08000
> ErrorCode: 101
>
> I still can't solve this problem. Can anyone help me?
>
> Thank you very much.
>
> 2016-07-23 10:34 GMT+08:00 Yang Zhang <zhang.yang...@gmail.com>:
>
>> Hello everyone,
>>
>> I got an org.apache.hadoop.hbase.DoNotRetryIOException when using
>> Phoenix. My version is phoenix-4.4.0-HBase-0.98-bin.
>> I created a table and upserted data into it. Later I wanted to modify
>> it, so I executed (alter table drop column c1) and (alter table add c2
>> bigint). After that I upserted new data into the table for each row. A
>> SELECT * from the table succeeded.
>>
>> But after a while, when I ran SELECT * on it again, I got a
>> DoNotRetryIOException at
>> org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:84),
>> caused by java.lang.ArrayIndexOutOfBoundsException: 10 at
>> org.apache.phoenix.schema.PTableImpl.init(PTableImpl.java:354).
>>
>> When I try to drop the table, I still get the DoNotRetryIOException.
>> By the way, has anyone tried dropping a Phoenix table from the hbase
>> shell? I need to find some way to drop this table.
>>
>> Thanks very much!