Hi,

Thanks for your quick reply. Yes, I am using ExecuteStreamCommand to execute
the script below:

zk-migrator.sh -s -z
destinationHostname:destinationClientPort/destinationRootPath/components
-f /path/to/export/zk-source-data.json

Can the zk-source-data.json file be written as an output flow file of the
above processor? If yes, please let me know how.

Many thanks
sanjeet

On Fri, 10 Apr 2020, 9:25 pm Bryan Bende, <bbe...@gmail.com> wrote:

> Hello,
>
> Assuming you are using the ExecuteStreamCommand processor, then the
> output of the command is written to the flow file content. So if your
> command writes the JSON to stdout, then it should end up in the flow file.
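>
> One way to do that (just a sketch; the wrapper name, temp path, and
> migrator flags here are assumptions, so adjust them to your setup) is to
> point ExecuteStreamCommand at a small wrapper script that runs the
> migrator and then cats the resulting JSON to stdout:
>
> #!/bin/bash
> # export-zk.sh -- sketch only; paths and flags are placeholders
> OUT=/tmp/zk-source-data.json
> # run the migrator so it writes the JSON export to a temp file
> /path/to/zk-migrator.sh -r -z sourceHostname:sourceClientPort/sourceRootPath/components -f "$OUT"
> # write the JSON to stdout so it becomes the flow file content
> cat "$OUT"
> # clean up the temp file
> rm -f "$OUT"
>
> Then the "output stream" relationship of ExecuteStreamCommand can feed
> PutS3Object directly, so there is no need for GetFile in between.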
>
> Thanks,
>
> Bryan
>
>
> On Fri, Apr 10, 2020 at 11:23 AM sanjeet rath <rath.sanj...@gmail.com>
> wrote:
>
>> Hi,
>>
>> I have a scenario where I have to execute a shell script, the output of
>> the script is a JSON file, and I want to put that file in an S3 bucket.
>>
>> I am able to do it by building 2 flows.
>>
>> One flow uses ExecuteStreamCommand and stores the JSON file in a folder
>> on the file system.
>>
>> Then another flow gets the file from the file system and uses PutS3Object
>> to put it in the S3 bucket. But splitting the write and the read across
>> two separate flows is a bit risky, and there is no way to make them
>> dependent on each other since GetFile does not accept incoming connections.
>>
>>
>> Is it possible to avoid storing the JSON file (the output of the shell
>> script) on the file system and instead move it to the S3 bucket as a flow
>> file? In other words, in one flow, execute the shell script and store its
>> output in the S3 bucket.
>>
>> Regards,
>> Sanjeet
>>
>>
