> Hi Team,
>
> Using a Flume interceptor, I am reading messages from Kafka as key/value
> pairs. In the code below, the key is an integer variable called pk, and the
> message value is held in a variable called obj as a Ctrl-A ("\u0001")
> delimited string.
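>
> For context, inside the interceptor the key and value are pulled out of each
> Flume event roughly along these lines (a simplified sketch; the "key" header
> name is an assumption about how the Kafka source exposes the message key):
>
> import java.nio.charset.StandardCharsets;
> import org.apache.flume.Event;
>
> // Sketch: extract the integer key and the Ctrl-A delimited value from a
> // Flume event. The "key" header name depends on the Kafka source config.
> private int parsePk(Event event) {
>   return Integer.parseInt(event.getHeaders().get("key"));
> }
>
> private String[] parseColumns(Event event) {
>   String obj = new String(event.getBody(), StandardCharsets.UTF_8);
>   return obj.split("\u0001");   // Ctrl-A separated column values
> }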
>
> For inserting into the Hive table, I am doing this:
>
> MutableRecord mutableRecordInput = new MutableRecord(pk, obj.split("\u0001"));
> MutableRecord mutableRecord = (MutableRecord) bucketIdResolver.attachBucketIdToRecord(mutableRecordInput);
> coordinator.insert(Collections.<String>emptyList(), mutableRecord);
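>
> For reference, MutableRecord here is my own record class, modelled on the
> example record class used with the Hive streaming mutation API. A trimmed
> sketch (the field layout is illustrative, not the exact table schema):
>
> import org.apache.hadoop.hive.ql.io.RecordIdentifier;
>
> // Illustrative record class; the real one mirrors the table columns.
> public class MutableRecord {
>   public final int pk;            // key taken from the Kafka message key
>   public final String[] fields;   // Ctrl-A separated columns from the value
>   public RecordIdentifier rowId;  // ROW__ID; set when mutating an existing row
>
>   // Used for inserts - no RecordIdentifier yet.
>   public MutableRecord(int pk, String[] fields) {
>     this.pk = pk;
>     this.fields = fields;
>   }
>
>   // Used for updates - carries the identifier of the row to mutate.
>   public MutableRecord(int pk, RecordIdentifier rowId, String[] fields) {
>     this(pk, fields);
>     this.rowId = rowId;
>   }
> }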
>
>
> For updating records in the Hive table, with pk being the key, I am doing this:
>
> RecordIdentifier rd = new RecordIdentifier();
> MutableRecord mutableRecordInputup = new MutableRecord(pk, rd, obj.split("\u0001"));
> MutableRecord mutableRecordInputu = (MutableRecord) bucketIdResolver.attachBucketIdToRecord(mutableRecordInputup);
> coordinator.update(Collections.<String>emptyList(), mutableRecordInputu);
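>
> Note that rd above is just the no-argument RecordIdentifier; I am not
> populating it from the existing row's ROW__ID anywhere. If that is what the
> update path expects, I imagine it would look roughly like the sketch below
> (the transaction id, bucket id and row id values are placeholders only and
> would presumably have to come from the row that was originally written):
>
> import java.util.Collections;
> import org.apache.hadoop.hive.ql.io.RecordIdentifier;
>
> // Placeholder values - presumably these must be read back from the existing
> // row's ROW__ID rather than defaulted or invented.
> long originalTransactionId = 42L;  // placeholder
> int bucketId = 0;                  // placeholder
> long rowId = 7L;                   // placeholder
>
> RecordIdentifier existingRowId = new RecordIdentifier(originalTransactionId, bucketId, rowId);
> MutableRecord updateRecord = new MutableRecord(pk, existingRowId, obj.split("\u0001"));
> // (unclear to me whether attachBucketIdToRecord should still be applied here)
> coordinator.update(Collections.<String>emptyList(), updateRecord);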
>
>
> The inserts work fine. However, the updates are causing data inconsistency:
> instead of updating the existing row, the first update actually inserts a new
> row, and subsequent updates then update that newly inserted row, so it is not
> working as expected. Please let me know whether I am doing the updates the
> right way.
>
> I am using Flume version 1.6 and Hive version 2.1, as provided by MapR.
>
> Thanks and Regards,
>
> Manjunath Anand
