I would also suggest, if you want a static flow and don't want to update the
data directly in NiFi, putting the JSON strings in a file and using GetFile to
read it in, then SplitText to split it into individual lines (one flowfile per
line), and processing them that way.
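
For example, if the file (say, queries.txt -- name hypothetical) holds one JSON
query per line, values made up here:

  {"name": "query-one", "body": {"match_all": {}}}
  {"name": "query-two", "body": {"match_all": {}}}

then GetFile followed by SplitText with a Line Split Count of 1 should give you
one flowfile per query.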


Andy LoPresto
alopre...@apache.org
alopresto.apa...@gmail.com
PGP Fingerprint: 70EC B3E5 98A6 5A3F D3C4  BACE 3C6E F65B 2F7D EF69

> On Mar 28, 2018, at 2:13 PM, Bryan Bende <bbe...@gmail.com> wrote:
> 
> Since it sounds like each query is a JSON document, can you create a
> JSON array of all your queries and put that as the Custom Text of a
> GenerateFlowFile processor?
> 
> Then follow it with SplitJson to split the array into one query per flow
> file, assuming that is what you want.
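
For reference, the Custom Text for that approach would just be a JSON array of
the query documents (values here are hypothetical), e.g.

  [{"q": "first query"}, {"q": "second query"}, {"q": "third query"}]

and SplitJson pointed at the top-level array with a JsonPath Expression such as
$.* (or $[*]) should emit one flowfile per element.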
> 
> Could also use ExecuteScript and enter all the queries as user-defined
> properties, then write a small script that loops over the properties and
> produces a flow file for each dynamic property, where the content is
> the value of the property, which would be the query string.
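
Not tested, but a minimal Jython sketch of such an ExecuteScript body might
look like this (session, context, and REL_SUCCESS are the standard
ExecuteScript bindings; the query.name attribute is just illustrative):

  from org.apache.nifi.processor.io import OutputStreamCallback

  class WriteQuery(OutputStreamCallback):
      # Writes one query string as the flowfile content
      def __init__(self, text):
          self.text = text
      def process(self, outputStream):
          outputStream.write(bytearray(self.text.encode('utf-8')))

  # Each user-defined (dynamic) property holds one query string
  for entry in context.getProperties().entrySet():
      descriptor = entry.getKey()
      if descriptor.isDynamic():
          flowFile = session.create()
          flowFile = session.putAttribute(flowFile, 'query.name', descriptor.getName())
          flowFile = session.write(flowFile, WriteQuery(entry.getValue()))
          session.transfer(flowFile, REL_SUCCESS)
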
> 
> On Wed, Mar 28, 2018 at 5:03 PM, Mike Thomsen <mikerthom...@gmail.com> wrote:
>> More specifically: I know you can get this functionality by chaining other
>> processors or using a bunch of GenerateFlowFile processors. What I'm looking
>> for is a one-stop way of associating dozens of small queries with one
>> processor and having it send them out on its own in batches, with no backend
>> dependency like a database.
>> 
>> On Wed, Mar 28, 2018 at 5:02 PM, Mike Thomsen <mikerthom...@gmail.com>
>> wrote:
>> 
>>> I have a client that would benefit from being able to run certain queries
>>> periodically, say half a dozen or more. Is there any processor where you
>>> can associate a bunch of strings (like JSON) with it and send them out
>>> individually as flowfiles?
>>> 
>>> Thanks,
>>> 
>>> Mike
>>> 
