I was able to solve the issue by deleting the hbase folder in HDFS with "hdfs dfs
-rm -r /hbase" and restarting HBase.
App creation in pio is working again now.
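In case it helps anyone hitting the same error, the recovery steps above can be sketched as a shell session (the stop/start script names assume a standard HBase tarball install and a default hbase.rootdir of /hbase; adjust for your setup):

```shell
# Stop HBase before touching its data directory (script names from a
# standard HBase distribution; adjust if you manage HBase differently).
stop-hbase.sh

# Remove HBase's root directory in HDFS. WARNING: this deletes ALL
# HBase tables and data, which was acceptable here on a test machine.
hdfs dfs -rm -r /hbase

# Restart HBase; it recreates the root directory from scratch on startup.
start-hbase.sh

# App creation in PredictionIO should work again after this.
pio app new mlolur
```

Note this is a scorched-earth fix: it only makes sense on a disposable test setup like the single-machine one described below.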

I still wonder why this problem happened, though. I'm running HBase in
pseudo-distributed mode (for testing purposes everything, from Spark to
Hadoop, is on a single machine); could that be a problem for PredictionIO
when managing the apps?

2018-05-29 13:47 GMT+02:00 Marco Goldin <markomar...@gmail.com>:

> Hi all, I deleted all old apps from PredictionIO (currently running 0.12.0),
> but when I create a new one I get this error from HBase.
> I inspected HBase from the shell, but there aren't any tables inside.
>
>
> ```
> pio app new mlolur
>
> [INFO] [HBLEvents] The table pio_event:events_1 doesn't exist yet. Creating now...
>
> Exception in thread "main" org.apache.hadoop.hbase.TableExistsException: org.apache.hadoop.hbase.TableExistsException: pio_event:events_1
> at org.apache.hadoop.hbase.master.procedure.CreateTableProcedure.prepareCreate(CreateTableProcedure.java:299)
> at org.apache.hadoop.hbase.master.procedure.CreateTableProcedure.executeFromState(CreateTableProcedure.java:106)
> at org.apache.hadoop.hbase.master.procedure.CreateTableProcedure.executeFromState(CreateTableProcedure.java:58)
> at org.apache.hadoop.hbase.procedure2.StateMachineProcedure.execute(StateMachineProcedure.java:119)
> at org.apache.hadoop.hbase.procedure2.Procedure.doExecute(Procedure.java:498)
> at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.execProcedure(ProcedureExecutor.java:1147)
> at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.execLoop(ProcedureExecutor.java:942)
> at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.execLoop(ProcedureExecutor.java:895)
> at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.access$400(ProcedureExecutor.java:77)
> at org.apache.hadoop.hbase.procedure2.ProcedureExecutor$2.run(ProcedureExecutor.java:497)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
> at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
> at org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:209)
> at org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:223)
> at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:121)
> at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:90)
> at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:3347)
> at org.apache.hadoop.hbase.client.HBaseAdmin.createTableAsync(HBaseAdmin.java:603)
> at org.apache.hadoop.hbase.client.HBaseAdmin.createTable(HBaseAdmin.java:494)
> at org.apache.hadoop.hbase.client.HBaseAdmin.createTable(HBaseAdmin.java:428)
> at org.apache.predictionio.data.storage.hbase.HBLEvents.init(HBLEvents.scala:66)
> at org.apache.predictionio.tools.commands.App$$anonfun$create$4$$anonfun$apply$5.apply(App.scala:63)
> at org.apache.predictionio.tools.commands.App$$anonfun$create$4$$anonfun$apply$5.apply(App.scala:62)
> at scala.Option.map(Option.scala:146)
> at org.apache.predictionio.tools.commands.App$$anonfun$create$4.apply(App.scala:62)
> at org.apache.predictionio.tools.commands.App$$anonfun$create$4.apply(App.scala:55)
> at scala.Option.getOrElse(Option.scala:121)
> at org.apache.predictionio.tools.commands.App$.create(App.scala:55)
> at org.apache.predictionio.tools.console.Pio$App$.create(Pio.scala:173)
> at org.apache.predictionio.tools.console.Console$$anonfun$main$1.apply(Console.scala:726)
> at org.apache.predictionio.tools.console.Console$$anonfun$main$1.apply(Console.scala:656)
> at scala.Option.map(Option.scala:146)
> at org.apache.predictionio.tools.console.Console$.main(Console.scala:656)
> at org.apache.predictionio.tools.console.Console.main(Console.scala)
> Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.TableExistsException): org.apache.hadoop.hbase.TableExistsException: pio_event:events_1
> at org.apache.hadoop.hbase.master.procedure.CreateTableProcedure.prepareCreate(CreateTableProcedure.java:299)
> at org.apache.hadoop.hbase.master.procedure.CreateTableProcedure.executeFromState(CreateTableProcedure.java:106)
> at org.apache.hadoop.hbase.master.procedure.CreateTableProcedure.executeFromState(CreateTableProcedure.java:58)
> at org.apache.hadoop.hbase.procedure2.StateMachineProcedure.execute(StateMachineProcedure.java:119)
> at org.apache.hadoop.hbase.procedure2.Procedure.doExecute(Procedure.java:498)
> at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.execProcedure(ProcedureExecutor.java:1147)
> at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.execLoop(ProcedureExecutor.java:942)
> at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.execLoop(ProcedureExecutor.java:895)
> at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.access$400(ProcedureExecutor.java:77)
> at org.apache.hadoop.hbase.procedure2.ProcedureExecutor$2.run(ProcedureExecutor.java:497)
> at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1457)
> at org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1661)
> at org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1719)
> at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.createTable(MasterProtos.java:42717)
> at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation$5.createTable(HConnectionManager.java:1964)
> at org.apache.hadoop.hbase.client.HBaseAdmin$2.call(HBaseAdmin.java:607)
> at org.apache.hadoop.hbase.client.HBaseAdmin$2.call(HBaseAdmin.java:603)
> at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:114)
> ... 19 more
> ```
>
> The strange thing is that when I call `pio app list` I get this:
>
> ```
>
> aml@ip-172-31-79-43:~$ pio app list
>
> [INFO] [Pio$]                 Name |   ID |                         Access Key | Allowed Event(s)
>
> [INFO] [Pio$] Finished listing 1 app(s).
> ```
> It says "Finished listing 1 app(s)" but shows none of the app's details. Is that normal?
>
> Any idea what could have caused this?
> Thanks,
> Marco Goldin
>
