I am trying out HBase on my local machine. I ran out of file handles while loading Wikipedia pages into a table as a test:

2009-07-13 18:59:20,223 FATAL org.apache.hadoop.hbase.regionserver.MemcacheFlusher: Replay of hlog required. Forcing server shutdown
org.apache.hadoop.hbase.DroppedSnapshotException: region: enwiki0903,Port Vila,1247474149864
        at org.apache.hadoop.hbase.regionserver.HRegion.internalFlushcache(HRegion.java:903)
        at org.apache.hadoop.hbase.regionserver.HRegion.flushcache(HRegion.java:796)
        at org.apache.hadoop.hbase.regionserver.MemcacheFlusher.flushRegion(MemcacheFlusher.java:265)
        at org.apache.hadoop.hbase.regionserver.MemcacheFlusher.run(MemcacheFlusher.java:148)
Caused by: java.io.FileNotFoundException: /home/joel/hbase-root/hbase-joel/hbase/enwiki0903/843294683/expanded/mapfiles/5105714541107922778/data (Too many open files)
        at java.io.FileOutputStream.open(Native Method)
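I assume the immediate trigger is the default per-process open-file limit (typically 1024 on Linux). Something like the following should check it and raise it before restarting HBase, though I haven't verified what value it actually needs:

        # show the current per-process open-file limit (often 1024 by default)
        ulimit -n

        # raise it for this shell session before starting HBase
        ulimit -n 32768

But raising the limit only stops the crash from recurring; my real question is about the data already on disk.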


I could not find any documentation on how to manually replay the HLog.

The replay doesn't seem to have happened automatically when I shut down and restarted the server.

Getting a row count of the table I had been loading into gave me no results, yet I have 31GB of data in my HBase data directory, much of it in oldlogfile.log files.
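For reference, the count was from the HBase shell, something like:

        count 'enwiki0903'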

How do I recover the data stored there? Why is the recovery not automatic?

Thanks,

- Joel
