Hi Mark, thanks for your feedback.
We will assemble the context details and share them here ASAP.

We use a Python script with the nipyapi library to implement a custom way of
executing NiFi flows:
1. check at the start whether the flow is already running
2. set the PG variable jobid
3. set the other PG variables
4. start the process group
5. loop:
    get the number of active threads + FlowFiles on queues
    if an error occurred, exit with an error status
    if threads = 0 and FlowFiles = 0, exit OK
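The wait loop in step 5 boils down to a small decision per polling iteration. A minimal sketch of that logic, separated from the REST calls so it can be shown standalone (the names `FlowStatus` and `decide` are illustrative, not from our actual script; in the real script the counts would come from nipyapi status calls):

```python
from dataclasses import dataclass

@dataclass
class FlowStatus:
    active_threads: int    # active thread count reported for the process group
    queued_flowfiles: int  # FlowFiles sitting on connections inside the PG
    has_error: bool        # e.g. an ERROR-level bulletin was observed

def decide(status: FlowStatus) -> str:
    """Return 'error', 'ok', or 'wait' for one iteration of the polling loop."""
    if status.has_error:
        return "error"  # step 5: if error then exit error
    if status.active_threads == 0 and status.queued_flowfiles == 0:
        return "ok"     # step 5: idle and drained, exit OK
    return "wait"       # otherwise keep polling
```

The script would call `decide` after each status fetch and sleep between iterations until it returns something other than "wait".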

Next week I will talk to the colleague who knows our Python script best to
understand the exact sequence of NiFi REST API calls it makes. Hopefully the
problem is in our script, but let's see.

Thanks,
Emanuel O.

On Thu 27 Feb 2020, 13:08 Mark Payne, <[email protected]> wrote:

> Emanuel,
>
> It sounds like a bug. When a variable is set, it should stop any
> processors currently referencing the variables, add the variable, and then
> restart the affected processors.
>
> Can you include the full stack trace? What version of NiFi? Are you
> changing the value of an existing variable or adding a new variable to the
> Process Group? A template of the flow would be helpful also, if that is
> something that you can provide.
>
> Thanks
> -Mark
>
> On Feb 26, 2020, at 6:14 AM, Oliveira, Emanuel <[email protected]>
> wrote:
>
> Hi all,
>
>
> We use a shell/Python script with nipyapi (a client for the NiFi REST API) to
> set:
> PG variable:
> JOBID=<control-m ordered>
>
> But we would prefer to rename things so that the prefix used by our
> LogMessage processors is ${job_jobid} instead of the PG var ${JOBID}:
> JOBID:${job_jobid}||
>
> Today, to our surprise, when we changed the Python script to set a new PG
> var:
> job_jobid
>
> we got the error:
> Cannot update variable 'job_jobid' because it is referenced by LogMessage
>
>
> Is this known/expected behaviour? I'm surprised enough to think this
> doesn't make sense..
>
> As workaround we could:
> *PG variables to be set externally – inputs for the flow:*
> FLOW_JOBID
> FLOW_BEGIN_DT
> (xx)
>
> *Add UpdateAttribute processor to create attributes as copies of the
> respective PG vars:*
> job_jobid= ${FLOW_JOBID}
> job_begin_dt= ${FLOW_BEGIN_DT}
>
>
>
> Thanks//Regards,
> *Emanuel Oliveira*
> Senior Data Engineer
>
>
>