Hi Sanooj,

I believe this is related to the issue described in PHOENIX-976 [1]. In that case it wasn't strictly a Kerberos problem, but rather one of HDFS file permissions (could it be that your dev environment doesn't have file permissions turned on? That would explain why things work fine there).

If you look at the comments on that JIRA ticket, there are a couple of things you could try to resolve this: running the import job as the hbase user, using custom file permissions, or using an alternate incremental load coprocessor.
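In case it's useful, below is a rough, untested sketch of the custom file permissions idea, using Hadoop's Java FileSystem API to recursively open up the HFile output directory before the LoadIncrementalHFiles step runs. The class name, the output path argument, and the deliberately wide-open permission bits are just illustrative assumptions on my part, not anything taken from the ticket:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.fs.permission.FsPermission;

    // Recursively opens up the bulk load output directory so that the region
    // servers (running as the hbase user) can read and move the generated
    // HFiles. Permissions are intentionally wide open here purely for
    // illustration; lock them down (or use group permissions) in practice.
    public class OpenHFilePermissions {

        private static final FsPermission DIR_PERM = new FsPermission((short) 0777);
        private static final FsPermission FILE_PERM = new FsPermission((short) 0666);

        public static void main(String[] args) throws Exception {
            // args[0] is whatever HFile output directory the bulk load job
            // wrote to (hypothetical path, supplied by you).
            FileSystem fs = FileSystem.get(new Configuration());
            chmodRecursive(fs, new Path(args[0]));
        }

        private static void chmodRecursive(FileSystem fs, Path path) throws Exception {
            FileStatus status = fs.getFileStatus(path);
            fs.setPermission(path, status.isDirectory() ? DIR_PERM : FILE_PERM);
            if (status.isDirectory()) {
                for (FileStatus child : fs.listStatus(path)) {
                    chmodRecursive(fs, child.getPath());
                }
            }
        }
    }

Running something like this (or an equivalent recursive chmod from the shell) against the output directory before the load step should let the region servers pick up the HFiles; running the whole job as the hbase user, or using the alternate coprocessor mentioned above, avoids the need for it altogether.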
- Gabriel

1. https://issues.apache.org/jira/browse/PHOENIX-976

On Tue, Nov 17, 2015 at 7:14 PM, Sanooj Padmakumar <[email protected]> wrote:
> Hello -
>
> I am using the Phoenix bulk load on a cluster secured with Kerberos. The
> mapper runs fine, the reducer runs fine, and the counters are printed
> fine, but then the LoadIncrementalHFiles step fails. A portion of the log
> is given below.
>
> 15/11/17 09:44:48 INFO mapreduce.LoadIncrementalHFiles: Trying to load
> hfile=hdfs://..........<<masked>>>
> 15/11/17 09:45:56 INFO client.RpcRetryingCaller: Call exception, tries=10,
> retries=35, started=68220 ms ago, cancelled=false, msg=row '' on table
> 'TABLE1' at region=TABLE1,<<<masked>>>>, seqNum=26
> 15/11/17 09:46:16 INFO client.RpcRetryingCaller: Call exception, tries=11,
> retries=35, started=88315 ms ago, cancelled=false, msg=row '' on table
> 'TABLE1' at region=TABLE1,<<<masked>>>>, seqNum=26
>
> Is there any setting I should make in order to make the program work in a
> Kerberos-secured environment?
>
> Please note, our DEV environment doesn't use Kerberos and things are working
> just fine.
>
> --
> Thanks in advance,
> Sanooj Padmakumar
