Indeed, it's related to both ;) You have the exception, which is fixed by HBASE-7467, and the size of the logs, which is fixed by HBASE-7214.
Also, I'd recommend disabling the DEBUG level for this class if you can't migrate to 0.94.4.

JM

2013/1/27, Ted Yu <[email protected]>:
> This should have been fixed by HBASE-7214, which is in 0.94.4
>
> Cheers
>
> On Sun, Jan 27, 2013 at 5:51 PM, Robert Dyer <[email protected]> wrote:
>
>> I noticed my HBase 0.94.3 master log was growing extremely large, then
>> saw this repeated over and over. Any ideas what is wrong here? Should I
>> just remove the /hbase/.archive directory for this table?
>>
>> (Note: I changed sensitive host/table names, and the 'checking directory'
>> line appears once per region; I'm omitting most of them here.)
>>
>> 2013-01-27 19:44:52,337 DEBUG org.apache.hadoop.hbase.master.cleaner.CleanerChore: Checking directory:
>> hdfs://XXXX:8020/hbase/.archive/XXXTABLEXXX/efffadf06a01127d476e378b8c9b73ce
>> 2013-01-27 19:44:52,337 DEBUG org.apache.hadoop.hbase.master.cleaner.CleanerChore: Checking directory:
>> hdfs://XXXX:8020/hbase/.archive/XXXTABLEXXX/f013601d853d9dc0c4d8d951843cae7d
>> 2013-01-27 19:44:52,338 DEBUG org.apache.hadoop.hbase.master.cleaner.CleanerChore: Checking directory:
>> hdfs://XXXX:8020/hbase/.archive/XXXTABLEXXX/f49e788d91e487ccaa5e772c49f48fd3
>> 2013-01-27 19:44:52,338 DEBUG org.apache.hadoop.hbase.master.cleaner.CleanerChore: Checking directory:
>> hdfs://XXXX:8020/hbase/.archive/XXXTABLEXXX/f58e3c8b336b692100a139b5d73f702a
>> 2013-01-27 19:44:52,338 DEBUG org.apache.hadoop.hbase.master.cleaner.CleanerChore: Checking directory:
>> hdfs://XXXX:8020/hbase/.archive/XXXTABLEXXX/f6acbb1629c7646ce2d1181e92d2a0ad
>> 2013-01-27 19:44:52,339 DEBUG org.apache.hadoop.hbase.master.cleaner.CleanerChore: Checking directory:
>> hdfs://XXXX:8020/hbase/.archive/XXXTABLEXXX/f8b28256870ce1ec011ed679d5348788
>> 2013-01-27 19:44:52,339 DEBUG org.apache.hadoop.hbase.master.cleaner.CleanerChore: Checking directory:
>> hdfs://XXXX:8020/hbase/.archive/XXXTABLEXXX/fce83dceecc91452e11e1c75b8454145
>> 2013-01-27 19:44:52,339 DEBUG org.apache.hadoop.hbase.master.cleaner.CleanerChore: Checking directory:
>> hdfs://XXXX:8020/hbase/.archive/XXXTABLEXXX/fed2d047b5fb8684db6faa7ccc2be85d
>> 2013-01-27 19:44:52,340 DEBUG org.apache.hadoop.hbase.master.cleaner.CleanerChore: Checking directory:
>> hdfs://XXXX:8020/hbase/.archive/XXXTABLEXXX/ff811b3e22145d22c2341bcaaadb0e4a
>> 2013-01-27 19:44:52,341 WARN org.apache.hadoop.hbase.master.cleaner.CleanerChore: Error while cleaning the logs
>> java.io.IOException: java.io.IOException: /hbase/.archive/XXXTABLEXXX is non empty
>>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.delete(FSNamesystem.java:1972)
>>     at org.apache.hadoop.hdfs.server.namenode.NameNode.delete(NameNode.java:792)
>>     at sun.reflect.GeneratedMethodAccessor488.invoke(Unknown Source)
>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>     at java.lang.reflect.Method.invoke(Method.java:601)
>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)
>>
>>     at sun.reflect.GeneratedConstructorAccessor15.newInstance(Unknown Source)
>>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>     at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
>>     at org.apache.hadoop.hbase.RemoteExceptionHandler.decodeRemoteException(RemoteExceptionHandler.java:96)
>>     at org.apache.hadoop.hbase.RemoteExceptionHandler.checkThrowable(RemoteExceptionHandler.java:48)
>>     at org.apache.hadoop.hbase.RemoteExceptionHandler.checkIOException(RemoteExceptionHandler.java:66)
>>     at org.apache.hadoop.hbase.master.cleaner.CleanerChore.chore(CleanerChore.java:124)
>>     at org.apache.hadoop.hbase.Chore.run(Chore.java:67)
>>     at java.lang.Thread.run(Thread.java:722)
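For anyone wanting to follow the "disable DEBUG for this class" workaround: since HBase 0.94 logs through log4j, a per-logger override in the master's log4j.properties should do it. A minimal sketch, assuming the stock log4j.properties layout shipped with HBase (restart the master after editing):

```properties
# Raise CleanerChore's threshold so the per-region "Checking directory"
# DEBUG lines no longer flood the master log; WARN and above still appear.
log4j.logger.org.apache.hadoop.hbase.master.cleaner.CleanerChore=WARN
```

This only quiets the log growth; the "is non empty" IOException itself still needs the 0.94.4 fixes mentioned above.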
