Hi Andrew/Bryan,
Thank you for your replies. From the initial functional tests it looks like
upgrading the ConsumeKafka processor to the 1_0 version is working, but I'll
run the load tests to be sure. I'll try implementing the rest of the ideas
and will come back in case of any issues.
Thanks
Thanks for the suggestions. Can you clarify whether you mean the NiFi
processor property "Update Query", or that the FlowFile content requires
proper JSON? I'm not sure how to get proper JSON with the $set in there.
I made the following modifications based on it:
NiFi Processors Properties:
Update Query:
Two things:
1. You need to use valid JSON. Your query is not valid JSON because some of
the values are not quoted.
2. You need to make sure the update option is set to use operators, not to
use a whole document.
Let us know if that helps.
Mike
On Thu, Jun 21, 2018 at 3:19 PM Ryan Hendrickson wrote:
Hi Faisal,
Some observations and next steps.
1. Brian has a great point of using the more appropriate consumer that
will leverage the latest API.
2. This still certainly feels like a resource issue -- testing against a
single Kafka broker first and not growing the cluster seems odd, especially
Hi,
I can't seem to figure out the right combo of parameters to get a
document to update in Mongo using the PutMongo processor and the $set
operator.
Try 1:
The incoming flowfile contains the customId: abc
NiFi Processor Properties:
Mode: update
Upsert: false
Update Query Key: No
Hi Andy,
I am using version 1.6.0 as a single instance on my local MacBook Pro.
The configuration of the bootstrap-notification-services.xml file is:
http-notification
org.apache.nifi.bootstrap.notification.http.HttpNotificationService
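The two lines above look like the id and class values of a service entry with the surrounding XML stripped during extraction. For context, the stock bootstrap-notification-services.xml uses this shape (the URL value below is a placeholder, not from the original message):

```xml
<services>
    <service>
        <id>http-notification</id>
        <class>org.apache.nifi.bootstrap.notification.http.HttpNotificationService</class>
        <!-- The HttpNotificationService POSTs bootstrap notifications to this URL -->
        <property name="URL">http://localhost:8080/notify</property>
    </service>
</services>
```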
Kelsey,
I know it's not the best suggestion, but if NiFi Expression Language could be
used for the fields, you could use an UpdateAttribute processor to add the
attributes to the file depending on some other attribute, so that you're only
adding the attributes that are needed for that particular
Kelsey
Have you looked at JSONTreeReader?
thanks
joe
On Thu, Jun 21, 2018, 5:00 AM Kelsey RIDER
wrote:
> OK, thanks for the heads-up!
> If I could make another suggestion: could the JSONPathReader be made a
> little more dynamic? Currently you have to specify every single field…
OK, thanks for the heads-up!
If I could make another suggestion: could the JSONPathReader be made a little
more dynamic? Currently you have to specify every single field…
In my case (although I doubt I’m alone), I have several different collections
with different schemas. My options are either
Your general assessment about what you'd need is correct. It's a fairly
easy component to build, and I'll throw up a Jira ticket for it. Would
definitely be doable for NiFi 1.8.
Expect the Mongo stuff to go through some real clean up like this in 1.8.
One of the other big changes is I will be
Hello,
I've been experimenting with NiFi and MongoDB. I have a test collection with 1
million documents in it. Each document has the same flat JSON structure with 11
fields.
My NiFi flow exposes a webservice, which allows the user to fetch all the data
in CSV format.
However, 1M documents
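The flat-JSON-to-CSV step that flow performs can be sketched in plain Python (the documents and field names below are invented for illustration; a real NiFi flow would use record readers/writers rather than a script):

```python
import csv
import io
import json

# Pretend these are documents fetched from the Mongo collection; each has
# the same flat structure, so the first document's keys can serve as the
# CSV header row.
docs_json = '''[
  {"customId": "abc", "name": "one", "value": 1},
  {"customId": "def", "name": "two", "value": 2}
]'''
docs = json.loads(docs_json)

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(docs[0].keys()))
writer.writeheader()
writer.writerows(docs)
print(buf.getvalue())
```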