Vijay,

No worries, this thread is fine. The processor will stream the contents of the 
FlowFile to the standard input (stdin) of the process
that is spawned, so it will go to your bash script. The bash script can do 
whatever it needs to do: pipe to another command, etc.
Whatever is written to stdout becomes the content of the FlowFile. So it would 
be up to you to pipe the output of the first command
to the input of the second. Does that make sense?
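As a rough sketch, a script like the following (the filename and commands are just placeholders, not from your flow) could be pointed to by ExecuteStreamCommand's Command Path property:

```shell
#!/usr/bin/env bash
# process.sh -- hypothetical script invoked by ExecuteStreamCommand.
# NiFi streams the incoming FlowFile's content to this script's stdin;
# everything the script writes to stdout becomes the content of the
# outgoing FlowFile.

# Chain commands by piping: the output of the first command becomes
# the input of the second, and the final stdout is the new FlowFile.
tr '[:lower:]' '[:upper:]' | sort
```

For example, a FlowFile containing the lines "banana" and "apple" would come out as a single FlowFile containing "APPLE" and "BANANA".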

Thanks
-Mark



> On Feb 13, 2019, at 3:26 PM, Vijay Chhipa <[email protected]> wrote:
> 
> Mark,
> 
> Thanks for your quick response, 
> When calling bash script that has multiple commands, is there a single flow 
> file generated after all commands are executed (accumulating output from each 
> command) or multiple flow files generated per command line in the bash 
> script. 
> 
> Sorry for tacking another question onto this one; I can ask it in a 
> separate thread if that makes more sense. 
> 
> Thanks
> 
> 
>> On Feb 13, 2019, at 12:50 PM, Mark Payne <[email protected]> wrote:
>> 
>> Vijay,
>> 
>> This would be treated as arguments to a single command.
>> 
>> One option would be to create a simple bash script that executes the desired 
>> commands and invoke
>> that from the processor. Or, of course, you can chain together multiple 
>> processors.
>> 
>> Thanks
>> -Mark
>> 
>> 
>>> On Feb 13, 2019, at 1:48 PM, Vijay Chhipa <[email protected]> wrote:
>>> 
>>> Hi, 
>>> 
>>> I have an ExecuteStreamCommand processor running a single command 
>>> (executing java -jar <args>), and it runs fine. 
>>> 
>>> I need to run the same command but with different arguments. 
>>> 
>>> My question is: can I put multiple lines as command arguments and still 
>>> have a single instance of the ExecuteStreamCommand?
>>> 
>>> Would those be treated as arguments to a single command, or would each 
>>> line of arguments be treated as a separate command?
>>> 
>>> 
>>> Thanks 
>>> 
>>> Vijay
>>> 
>>> 
>>> 
>> 
> 
