Can you check whether your HBase is stable or not? (You can use the hbck tool to look for any inconsistencies.)
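Concretely, a quick check could look like this (a sketch, assuming an HBase 1.x cluster with the `hbase` launcher on the PATH; `MY_TABLE` is a placeholder for your actual Phoenix table name):

```shell
# Read-only consistency report for the whole cluster
hbase hbck

# Same report with per-region detail
hbase hbck -details

# Restrict the check to the table the job writes to
hbase hbck MY_TABLE
```

If hbck reports inconsistencies (holes, overlaps, unassigned regions), that would be consistent with the "Cannot get replica 0 location" errors in the reducer logs.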
On Sat, Aug 27, 2016 at 10:41 PM, Sanooj Padmakumar <[email protected]> wrote:

> Hi All,
>
> I am getting the same exception, this time when running a Phoenix MR job
> (https://phoenix.apache.org/phoenix_mr.html). The MR job works just fine if
> I do a select with a limit of some 10 rows, but when I run the same against
> a lot of data, I start getting the exception below after 66% of the reduce
> phase:
>
> 16/08/27 10:04:35 INFO mapreduce.Job: Task Id : attempt_1471862728027_0103_r_000036_0, Status : FAILED
> Error: java.lang.RuntimeException: org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 2000 actions: IOException: 2000 times,
>     at org.apache.phoenix.mapreduce.PhoenixRecordWriter.close(PhoenixRecordWriter.java:62)
>     at org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.close(ReduceTask.java:550)
>     ............
> Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 2000 actions: IOException: 2000 times,
>     at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:227)
>     at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$1700(AsyncProcess.java:207)
>     at org.apache.hadoop.hbase.client.AsyncProcess$AsyncRequestFutureImpl.getErrors(AsyncProcess.java:1568)
>     at org.apache.hadoop.hbase.client.HTable.batch(HTable.java:1003)
>     at org.apache.hadoop.hbase.client.HTable.batch(HTable.java:1017)
>     at org.apache.phoenix.execute.MutationState.commit(MutationState.java:444)
>     ... 13 more
>
> When I look at the failed reducer logs I see many entries like this:
>
> 2016-08-27 10:07:05,834 ERROR [main] org.apache.hadoop.hbase.client.AsyncProcess: Cannot get replica 0 location for {"totalColumns":25,"families":{"p":[{"timestamp":1472317624237,"tag":[],"qualifier"
>
> But as I said, with a much smaller number of rows in the select query, the
> MR job works just fine and the data is populated correctly.
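Regarding the "Failed 2000 actions" above: that count suggests the Phoenix client is flushing a large upsert batch in one go, so shrinking the batch and giving the HBase client more retry headroom are worth experimenting with. A sketch follows; the property keys are real Phoenix/HBase settings, but the values are illustrative guesses, not recommendations:

```java
import java.util.Properties;

public class PhoenixWriteTuning {
    // The keys are real Phoenix/HBase client settings; the values below
    // are illustrative starting points to experiment with on your cluster.
    public static Properties tuned() {
        Properties p = new Properties();
        // Commit fewer Phoenix mutations at a time -> smaller HBase batches
        p.setProperty("phoenix.mutate.batchSize", "200");
        // Allow more client-side retries before RetriesExhausted... fires
        p.setProperty("hbase.client.retries.number", "10");
        // Give slow region servers more time per RPC (milliseconds)
        p.setProperty("hbase.rpc.timeout", "120000");
        return p;
    }
}
```

The same keys can be passed as connection properties to `DriverManager.getConnection(url, props)` for the JDBC case, or set on the job's Hadoop Configuration for the MR case.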
>
> Any Phoenix/HBase parameters that I should look into?
>
> Thanks
> Sanooj Padmakumar
>
> On Wed, Aug 24, 2016 at 11:26 AM, Sanooj Padmakumar <[email protected]> wrote:
>
>> Hi All,
>>
>> I get this error on one of the nodes where we have an application
>> running. It appears only after a certain duration, and once the
>> application is restarted things start working normally again. Any inputs
>> as to why this might be happening would be of great help.
>>
>> org.springframework.jdbc.UncategorizedSQLException: PreparedStatementCallback; uncategorized SQLException for SQL [<<query here>>]; SQL state [null]; error code [0]; org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, ; nested exception is org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time,
>>     at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:84)
>>     at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:81)
>>     at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:81)
>>     at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:660)
>>     at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:909)
>>     at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:933)
>>     at org.springframework.jdbc.core.namedparam.NamedParameterJdbcTemplate.update(NamedParameterJdbcTemplate.java:313)
>> Caused by: org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time,
>>     at org.apache.phoenix.execute.MutationState.commit(MutationState.java:473)
>>     at org.apache.phoenix.jdbc.PhoenixConnection$3.call(PhoenixConnection.java:472)
>>     at org.apache.phoenix.jdbc.PhoenixConnection$3.call(PhoenixConnection.java:469)
>>     at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
>>     at org.apache.phoenix.jdbc.PhoenixConnection.commit(PhoenixConnection.java:469)
>>     at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:323)
>>     at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:312)
>>     at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
>>     at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:310)
>>     at org.apache.phoenix.jdbc.PhoenixPreparedStatement.executeUpdate(PhoenixPreparedStatement.java:200)
>>     at org.apache.commons.dbcp.DelegatingPreparedStatement.executeUpdate(DelegatingPreparedStatement.java:105)
>>     at org.apache.commons.dbcp.DelegatingPreparedStatement.executeUpdate(DelegatingPreparedStatement.java:105)
>>     at org.springframework.jdbc.core.JdbcTemplate$2.doInPreparedStatement(JdbcTemplate.java:916)
>>     at org.springframework.jdbc.core.JdbcTemplate$2.doInPreparedStatement(JdbcTemplate.java:909)
>>     at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:644)
>>     ... 11 more
>> Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time,
>>     at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:227)
>>     at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$1700(AsyncProcess.java:207)
>>     at org.apache.hadoop.hbase.client.AsyncProcess$AsyncRequestFutureImpl.getErrors(AsyncProcess.java:1568)
>>     at org.apache.hadoop.hbase.client.HTable.batch(HTable.java:1003)
>>     at org.apache.hadoop.hbase.client.HTable.batch(HTable.java:1017)
>>     at org.apache.phoenix.execute.MutationState.commit(MutationState.java:444)
>>     ... 25 more
>>
>> --
>> Thanks,
>> Sanooj Padmakumar
>
> --
> Thanks,
> Sanooj Padmakumar
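For the long-running application case above, where a single commit dies on one failed action, one pattern worth trying is committing upserts in small explicit batches instead of one large commit. Below is a hypothetical sketch (not the poster's code): the JDBC URL, `MY_TABLE`, and the column names are placeholders, and running it requires the Phoenix driver on the classpath. Smaller commits mean a transient region-server IOException fails fewer rows and is cheaper to retry.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.util.List;

public class BatchedUpsert {

    // How many commits a commit-every-batchSize loop performs in total.
    static int expectedCommits(int totalRows, int batchSize) {
        return totalRows / batchSize + (totalRows % batchSize > 0 ? 1 : 0);
    }

    public static void load(List<String[]> rows, int batchSize) throws Exception {
        // Placeholder ZooKeeper quorum; adjust for your cluster.
        try (Connection conn =
                 DriverManager.getConnection("jdbc:phoenix:zk-host:2181")) {
            conn.setAutoCommit(false); // commit manually, in batches
            try (PreparedStatement ps = conn.prepareStatement(
                     "UPSERT INTO MY_TABLE (ID, VAL) VALUES (?, ?)")) {
                int pending = 0;
                for (String[] row : rows) {
                    ps.setString(1, row[0]);
                    ps.setString(2, row[1]);
                    ps.executeUpdate();
                    if (++pending == batchSize) {
                        conn.commit(); // flush a small batch to HBase
                        pending = 0;
                    }
                }
                if (pending > 0) {
                    conn.commit(); // flush the remainder
                }
            }
        }
    }
}
```

It would also be worth watching the region server logs at the moment of the failure, since the client-side IOException only says that the server side rejected the write, not why.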
