Hi,
I'd like to perform a Put, but have it succeed only if certain conditions on the values of some columns are met.
I am on HBase 0.94.6.
My first option was to write a prePut() observer, perform the checks in there, and call bypass() when they fail.
However, calling bypass() seems to have no effect. I thought that bypass() would cause HBase to skip the put.
Is that not correct?
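Here is a simplified sketch of what my observer does; checkMyCondition() below is just a placeholder for my actual column-value checks:

import java.io.IOException;

import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.coprocessor.BaseRegionObserver;
import org.apache.hadoop.hbase.coprocessor.ObserverContext;
import org.apache.hadoop.hbase.coprocessor.RegionCoprocessorEnvironment;
import org.apache.hadoop.hbase.regionserver.wal.WALEdit;

public class ConditionalPutObserver extends BaseRegionObserver {

    @Override
    public void prePut(ObserverContext<RegionCoprocessorEnvironment> e,
                       Put put, WALEdit edit, boolean writeToWAL) throws IOException {
        // Placeholder for my real validation against the row's column values.
        boolean conditionMet = checkMyCondition(e.getEnvironment(), put);

        if (!conditionMet) {
            // My understanding: this should make the region server skip the
            // default put processing, but the row still gets written.
            e.bypass();
        }
    }

    // Hypothetical helper standing in for the actual checks.
    private boolean checkMyCondition(RegionCoprocessorEnvironment env, Put put) {
        return false; // reject everything, just to test whether bypass() takes effect
    }
}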
My second option was to write a preCheckAndPut() and perform the validations there. The problem is that I have been running the load via map/reduce, setting up the job with
TableMapReduceUtil.initTableReducerJob(TABLE_NAME, null, job);
and there seems to be no way to tell that write path to use "checkAndPut" instead of "put". Is there?
My third option is to do my own HTable.checkAndPut() from the mapper. However, since I am on a Kerberized cluster, I am not able to create an HTable in the mapper's setup() method; it fails with
GSS initiate failed [Caused by GSSException: No valid credentials provided
(Mechanism level: Failed to find any Kerberos tgt)]
Calling UserGroupInformation.loginUserFromKeytab() first fails too; I get
Caused by: javax.security.auth.login.LoginException: Unable to obtain password from user
So I am not able to make any HTable calls from the mapper.
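For completeness, this is roughly what I am attempting in setup(); the principal, keytab path, and table name are placeholders, and both the plain HTable construction and the explicit keytab login fail with the exceptions above:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.security.UserGroupInformation;

public class ConditionalLoadMapper
        extends Mapper<LongWritable, Text, NullWritable, NullWritable> {

    private HTable table;

    @Override
    protected void setup(Context context) throws IOException, InterruptedException {
        Configuration conf = HBaseConfiguration.create(context.getConfiguration());

        // Attempt 2: explicit keytab login (placeholder principal/keytab) --
        // this is the call that dies with "Unable to obtain password from user".
        UserGroupInformation.loginUserFromKeytab("my-principal@EXAMPLE.COM",
                "/path/to/my.keytab");

        // Attempt 1: without the login above, this constructor fails with
        // "GSS initiate failed ... Failed to find any Kerberos tgt".
        table = new HTable(conf, "MY_TABLE");
    }

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Parse the input line, build the Put, and call table.checkAndPut(...) here.
    }

    @Override
    protected void cleanup(Context context) throws IOException, InterruptedException {
        if (table != null) {
            table.close();
        }
    }
}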
The only M/R setup that has worked for me for reading a text file from HDFS as the source and writing to HBase as the sink is TableMapReduceUtil.initTableReducerJob() with null for the reducer class.
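That working job looks roughly like this (simplified; the table name, column family, and line parsing are placeholders for my real ones):

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;

public class TextToHBaseJob {

    // Placeholder mapper: splits each tab-delimited line into (rowkey, value)
    // and emits one Put per record.
    public static class PutEmittingMapper
            extends Mapper<LongWritable, Text, ImmutableBytesWritable, Put> {
        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split("\t");
            Put put = new Put(Bytes.toBytes(fields[0]));
            put.add(Bytes.toBytes("cf"), Bytes.toBytes("col"), Bytes.toBytes(fields[1]));
            context.write(new ImmutableBytesWritable(put.getRow()), put);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        Job job = new Job(conf, "hdfs-text-to-hbase");
        job.setJarByClass(TextToHBaseJob.class);

        job.setMapperClass(PutEmittingMapper.class);
        job.setInputFormatClass(TextInputFormat.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));

        // Null reducer + no reduce tasks: TableOutputFormat writes the mapper's
        // Puts straight to the table with plain put() calls -- which is exactly
        // where I cannot substitute checkAndPut.
        TableMapReduceUtil.initTableReducerJob("MY_TABLE", null, job);
        job.setNumReduceTasks(0);

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}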
Any thoughts?
Thank you very much in advance
ameet