Hi Larry,

If you shut down HBase *and* HDFS, and then restart them both, does
that clear the problem? 

Are you running the Hadoop daemons (including DFS) under a user account such as
"hadoop" or similar? Have you increased the open files limit (nofile in
/etc/security/limits.conf on RedHat-style systems) for that user from the
default of 1024 to something substantially larger (I use 32768)?
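For reference, the limits.conf entries I mean look like the below. (The "hadoop" user name is just an example; substitute whatever account actually runs the daemons, and note the limit only takes effect on the next login for that user.)

```
# /etc/security/limits.conf -- raise the open-files limit for the
# account running the Hadoop/HBase daemons ("hadoop" is an example)
hadoop  soft  nofile  32768
hadoop  hard  nofile  32768
```

You can verify it took effect by running `ulimit -n` in a fresh shell as that user.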

Have you adjusted the HDFS configuration as suggested at 
http://wiki.apache.org/hadoop/Hbase/Troubleshooting , items 5 and 6?
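In case it saves you a lookup: the change most often cited alongside that page is raising the datanode transceiver limit in hdfs-site.xml. This is from memory, so do check the wiki items themselves for the current guidance and values; 4096 is just the value commonly suggested on this list.

```
<!-- hdfs-site.xml: raise the datanode transceiver limit
     (yes, the property name really is spelled "xcievers");
     4096 is a commonly suggested value, not an official one -->
<property>
  <name>dfs.datanode.max.xcievers</name>
  <value>4096</value>
</property>
```

Restart the datanodes after changing it.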

   - Andy

> From: Larry Compton
> Subject: java.io.IOException: Could not obtain block
> 2009-01-29 13:07:50,439 WARN
> org.apache.hadoop.hdfs.DFSClient: DFS Read:
> java.io.IOException: Could not obtain block:
> blk_2439003473799601954_58348
> file=/hbase/-ROOT-/70236052/info/mapfiles/2587717070724571438/data
[...]
> 
> Hadoop 0.19.0
> HBase 0.19.0
[...]
