Hi,

I am getting a lot of these RetriesExhaustedExceptions when I run my m/r job. It happens only with the .116 server. What could be the issue? I have verified that the RegionServer is running on that machine, and its web UI at 192.168.1.116:60030 is also responding fine.
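Note that the web UI port (60030) responding doesn't prove the RPC port (60020) is reachable, and 60020 is the one the client is failing to contact in the trace below. A quick sketch of a reachability check (host and ports taken from the trace; the helper name is my own):

```python
import socket

def check_port(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# 60030 is the RS web UI; 60020 is the RPC port the HBase client actually uses.
for port in (60030, 60020):
    status = "reachable" if check_port("192.168.1.116", port) else "NOT reachable"
    print(f"192.168.1.116:{port} is {status}")
```

If 60030 is reachable but 60020 is not, a firewall rule or a stuck/restarting RegionServer process on that box would be worth checking.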
org.apache.hadoop.hbase.client.RetriesExhaustedException: Trying to contact
region server 192.168.1.116:60020 for region
Webevent,de6c33d0-4e17-47e5-af8a-f88f0af32235_1273198490000_a53c83e4-7a80-418c-bc99-f2f955bda9b2,1289462602425,
row
'e8f3e3c3-606e-4d1b-a84f-94c5421d153f_1273198296000_23717002-51e3-48e8-9fa4-7618e9728b93',
but failed after 10 attempts.
Exceptions:
org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed setting up proxy to /192.168.1.116:60020 after attempts=1
    [the same exception repeated identically for all 10 attempts]
at org.apache.hadoop.hbase.client.HConnectionManager$TableServers.getRegionServerWithRetries(HConnectionManager.java:1045)
at org.apache.hadoop.hbase.client.HConnectionManager$TableServers$3.doCall(HConnectionManager.java:1230)
at org.apache.hadoop.hbase.client.HConnectionManager$TableServers$Batch.process(HConnectionManager.java:1152)
at org.apache.hadoop.hbase.client.HConnectionManager$TableServers.processBatchOfRows(HConnectionManager.java:1238)
at org.apache.hadoop.hbase.client.HTable.flushCommits(HTable.java:666)
at org.apache.hadoop.hbase.client.HTable.put(HTable.java:510)
at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:94)
at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:55)
at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:498)
at org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
at BulkUpload$BulkUploadMapper.map(Unknown Source)
at BulkUpload$BulkUploadMapper.map(Unknown Source)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:621)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:305)
at org.apache.hadoop.mapred.Child.main(Child.java:170)
thanks,
hari