I have a general question on how to dynamically update and sync the SAS
tokens being used by the AzureBlobStorage processors (List, Fetch, Delete,
etc.).

From our NiFi we are accessing a storage container in a different cloud, and
thus cannot use Azure Managed Identity as is standard for storage containers
in the same cloud. We therefore need to use the SAS Token property supported
by the controller service or by the processors.

We pull a new SAS token on a periodic basis, before the old one expires.
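
For illustration, minting such a token with the azure-storage-blob SDK would
look roughly like the sketch below (the account name, key, container, and
permissions are placeholders; our actual token comes from a periodic pull):

    # Hedged sketch: mint a short-lived container SAS with azure-storage-blob.
    # All names and keys below are placeholders.
    from datetime import datetime, timedelta, timezone
    from azure.storage.blob import ContainerSasPermissions, generate_container_sas

    sas_token = generate_container_sas(
        account_name="mystorageaccount",      # placeholder
        container_name="mycontainer",         # placeholder
        account_key="<account-key>",          # placeholder
        permission=ContainerSasPermissions(read=True, list=True, delete=True),
        expiry=datetime.now(timezone.utc) + timedelta(hours=1),  # short-lived
    )
    print(sas_token)  # e.g. "se=...&sp=rld&sv=...&sr=c&sig=..."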

In manual testing we set a sensitive Parameter to the SAS token value and
configured the controller service's SAS Token property to reference the
Parameter we created. Everything worked as it should.

The problem occurs when the SAS token expires and we pull the new one. How
do I modify the Parameter? Using the NiFi API seems messy, with the
potential to corrupt the flow configuration, along with the fact that it
stops and starts the services/processors referencing the Parameter we are
updating.
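
For reference, the API route would look roughly like the sketch below
(against the parameter-context update-request endpoints; the base URL,
context id, and parameter name are placeholders, and authentication is
omitted):

    # Hedged sketch: rotate a sensitive Parameter via the NiFi REST API.
    # Base URL, context id, and parameter name are placeholders.
    import time
    import requests

    NIFI = "http://localhost:8080/nifi-api"
    CONTEXT_ID = "<parameter-context-id>"
    NEW_SAS_TOKEN = "<freshly-pulled-token>"

    # 1. Fetch the context to get its current revision (required for updates).
    ctx = requests.get(f"{NIFI}/parameter-contexts/{CONTEXT_ID}").json()

    # 2. Submit an asynchronous update request with the new sensitive value.
    body = {
        "revision": ctx["revision"],
        "id": CONTEXT_ID,
        "component": {
            "id": CONTEXT_ID,
            "parameters": [{"parameter": {
                "name": "sas_token",  # placeholder parameter name
                "sensitive": True,
                "value": NEW_SAS_TOKEN,
            }}],
        },
    }
    req = requests.post(
        f"{NIFI}/parameter-contexts/{CONTEXT_ID}/update-requests", json=body
    ).json()
    req_id = req["request"]["requestId"]

    # 3. Poll until NiFi finishes stopping, updating, and restarting the
    #    referencing components, then clean up the completed request.
    while True:
        status = requests.get(
            f"{NIFI}/parameter-contexts/{CONTEXT_ID}/update-requests/{req_id}"
        ).json()
        if status["request"]["complete"]:
            break
        time.sleep(1)
    requests.delete(
        f"{NIFI}/parameter-contexts/{CONTEXT_ID}/update-requests/{req_id}"
    )

Note step 3: NiFi cycles every component referencing the Parameter as part
of the update, which is exactly the disruption I'd like to avoid.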

I would like to use a flowfile attribute (which the documentation suggests
is possible). However, the ListAzureBlobStorage processor is a source
processor (no inputs), so I can't run a script upstream to fetch the new key
and set an attribute that the SAS Token property in the processor could then
use.

Any ideas on how I can do this? Rolling keys and then injecting the new key
into the data flow, for either AWS or Azure, seems like it would be common
practice.

If ListAzureBlobStorage had an input I could update it with new keys, or if
the list processor had a failure output I could update the key upon
expiration.

Thanks for any and all guidance.

Mike R
