Jean-Marc, I think you misunderstood. At run time, you can query HBase to find out the table schema and its column families.
While I agree that you are seeing poorly written exceptions, IMHO it's easier to avoid the problem in the first place. In a Map/Reduce job, inside the mapper class, you have everything you need to get the table's schema. From that you can see the column families.

HTH

-Mike

On Jul 9, 2012, at 8:42 AM, Jean-Marc Spaggiari wrote:

> In my case it was a coding issue. I used the wrong final byte array to
> access the CF. So I agree, the CF is well known since you create the
> table based on it. But maybe you have added some other CFs later and
> something went wrong?
>
> It's just that, based on the exception received, there is no indication
> that there might be an issue with the CF. So you might end up trying
> to figure out the issue far from where it really is.
>
> 2012/7/9, Michael Segel <[email protected]>:
>> This may beg the question ...
>> Why do you not know the CF?
>>
>> Your table schemas only consist of tables and CFs. So you should know them
>> at the start of your job or in your m/r Mapper.setup();
>>
>>
>> On Jul 9, 2012, at 7:25 AM, Jean-Marc Spaggiari wrote:
>>
>>> Hi,
>>>
>>> When we try to add a value to a CF which does not exist on a table, we
>>> are getting the error below. I think this is not really giving the
>>> right information about the issue.
>>>
>>> Would it not be better to provide an exception like
>>> UnknownColumnFamilyException?
>>>
>>> JM
>>>
>>> org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException:
>>> Failed 1 action: DoNotRetryIOException: 1 time, servers with issues:
>>> phenom:60020,
>>>     at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.processBatchCallback(HConnectionManager.java:1591)
>>>     at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.processBatch(HConnectionManager.java:1367)
>>>     at org.apache.hadoop.hbase.client.HTable.flushCommits(HTable.java:945)
>>>     at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:801)
>>>     at org.apache.hadoop.hbase.client.HTable.put(HTable.java:776)
>>>     at org.myapp.app.Integrator.main(Integrator.java:162)
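For anyone following along, here's a minimal sketch of what "get the schema and check the CF up front" could look like against the 0.92-era client API. The table name "mytable" and family name "cf1" are placeholders, and this needs a running cluster; the point is just that HTableDescriptor hands you the families, so you can fail fast with a readable message instead of a DoNotRetryIOException buried inside RetriesExhaustedWithDetailsException at flush time.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.util.Bytes;

public class FamilyCheck {
    public static void main(String[] args) throws IOException {
        Configuration conf = HBaseConfiguration.create();
        HTable table = new HTable(conf, "mytable"); // placeholder table name

        // Fetch the schema once, e.g. in Mapper.setup(), and cache it.
        HTableDescriptor desc = table.getTableDescriptor();
        for (HColumnDescriptor family : desc.getFamilies()) {
            System.out.println("family: " + Bytes.toString(family.getName()));
        }

        // Validate the CF before any Put, so a typo in the family name
        // surfaces immediately with a clear message.
        byte[] cf = Bytes.toBytes("cf1"); // placeholder CF name
        if (!desc.hasFamily(cf)) {
            throw new IllegalArgumentException(
                "Unknown column family: " + Bytes.toString(cf));
        }
        table.close();
    }
}
```

In a mapper you would do the descriptor lookup once in setup() rather than per record, since it's a round trip to the master/region servers.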
