Hi,
OK, I think I posted too quickly. I ran some more tests.
Upserting documents with operators enabled:
test 1: upsert the document {"id1":1,"$set":{"value":"abc"}} with PutMongo
updateQueryKey=id1 => OK
test 2: upsert the document {"id1":1,"id2":1,"$set":{"value":"abc"}} with
PutMongo updateQueryKey=id1,id2
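For readers following along, the effect of updateQueryKey is to split the incoming payload into a filter (built from the listed key fields) and an update (the operator fields such as $set), which is then applied as an upsert. A minimal Python sketch of that split, assuming this simplified model of the behavior (the helper name is mine, not NiFi's):

```python
def build_upsert(doc, update_query_keys):
    """Split a PutMongo-style payload into (filter, update) parts.

    This is an illustrative model, not NiFi's actual implementation:
    the configured key fields become the query filter, and the
    operator fields ($set, $inc, ...) become the update document.
    """
    query = {k: doc[k] for k in update_query_keys}
    update = {k: v for k, v in doc.items() if k.startswith("$")}
    return query, update

# With pymongo, the resulting pair would then be applied as:
#   collection.update_one(query, update, upsert=True)
```

For test 2 above, `build_upsert({"id1": 1, "id2": 1, "$set": {"value": "abc"}}, ["id1", "id2"])` yields the filter `{"id1": 1, "id2": 1}` and the update `{"$set": {"value": "abc"}}`.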
Kiran,
What do you have set for the "Maximum number of Bins" property of MergeContent?
Each 'zip bundle' will have all of its FlowFiles added to the same bin.
So if you have more 'zip bundles' coming in than you have available bins,
it will evict one of the bins before all of its FlowFiles have arrived.
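The behaviour Mark describes can be sketched as follows. This is a deliberately simplified model of MergeContent's binning (one bin per bundle id, oldest bin evicted when full), not NiFi's actual code; all names are illustrative:

```python
from collections import OrderedDict

def route_to_bin(bins, max_bins, bundle_id, flowfile, evicted):
    """Add a FlowFile to its bundle's bin, evicting the oldest bin if full.

    Simplified model of MergeContent: when a new bundle arrives and all
    bins are taken, the oldest bin is evicted (merged early) even though
    its fragments may be incomplete.
    """
    if bundle_id not in bins and len(bins) >= max_bins:
        old_id, old_files = bins.popitem(last=False)  # evict oldest bin
        evicted.append((old_id, old_files))
    bins.setdefault(bundle_id, []).append(flowfile)

bins, evicted = OrderedDict(), []
for bundle, fragment in [("a", 1), ("b", 1), ("c", 1), ("a", 2)]:
    route_to_bin(bins, max_bins=2, bundle_id=bundle,
                 flowfile=fragment, evicted=evicted)
# With only 2 bins and 3 bundles in flight, bundle "a" is evicted
# before its second fragment arrives.
```

The fix implied in the thread is simply to raise "Maximum number of Bins" above the number of zip bundles that can be in flight at once.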
Hi Vitaly,
Can you share the configuration of your processor? Can you switch the log
level to DEBUG for this processor? Do you have incoming relationships on the
processor (from another processor? a failure relationship?)?
Thanks,
Pierre
2018-07-10 18:11 GMT+02:00 Vitaly Krivoy:
> Has anyone tried to use
Hi Kiran,
In your flow, how do you avoid duplicate files going into MergeContent?
For example:
1. file1.zip goes into the 'Unpack zip file' processor; it contains 5 files.
2. These 5 files are sent down both success paths (AttributeToJSON, and the
'increment fragment index and count' step)
3. 5 files show up at
Mark,
Thank you, that was the issue; it's all working fine now.
Thanks
--- Original Message ---
On 11 July 2018 3:32 PM, Mark Payne wrote:
> Kiran,
>
> What do you have set for the "Maximum number of Bins" property of
> MergeContent?
> Each 'zip bundle' will have all of the