Hi Mike,

You're right, I don't think you're going to get anywhere with a flowfile
attribute providing the value: as you said, ListAzureBlobStorage is a
source processor and doesn't accept any input flowfiles.

So currently, you're really looking at two choices, which you've already
identified:
1.  Manually rotate the SAS token through the user interface.
2.  Make an HTTP call to the NiFi API causing an update to the controller
service SAS token property.

Now, #2 is not quite as bad as you think it is. Luckily, NiFi deals very
nicely with the temporary restart of the backing controller service. NiFi
handles stopping all the processors that reference the controller service
and then bringing them back up (driven by multiple API calls), and it will
queue up the flowfiles for the short time the operation takes (but yes, it
does need to stop those processors before any changes can be made). In
theory, you shouldn't have any corruption issues going this path. The
tricky part will be writing the script that connects to the API and issues
the requests to make the change. A bit painful, but possible. Your script
would need to locate or know how to find the correct controller service
(typically via its uuid).

All this really speaks to a feature request, to be honest. It would be
ideal if the controller service could reconfigure itself by grabbing the
new SAS token for your storage account from Azure. Presumably the best
practice would be to pick up the new SAS token from Key Vault (since Key
Vault can be configured to own SAS token generation for the storage
account)? I guess Azure AD / OAuth is also a possible (alternative)
solution?
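On the Key Vault angle: when Key Vault manages the storage account, each
rotated SAS surfaces as an ordinary secret (named after the storage account
and SAS definition), so a rotation script can simply read that secret. A
minimal sketch, again standard library only; the vault name, secret name,
and api-version here are placeholders, and obtaining the AAD bearer token
(e.g. from a managed identity endpoint) is left out of scope:

```python
# Sketch: reading the current SAS from Key Vault's REST API. Vault name,
# secret name, and api-version are placeholders; you still need an AAD
# bearer token, which this snippet assumes you already have.
import json
import urllib.request

def secret_url(vault, secret, api_version="7.4"):
    """Build the Key Vault 'get secret' URL."""
    return f"https://{vault}.vault.azure.net/secrets/{secret}?api-version={api_version}"

def fetch_sas(vault, secret, bearer_token):
    """Return the secret's value (the SAS token) from Key Vault."""
    req = urllib.request.Request(
        secret_url(vault, secret),
        headers={"Authorization": f"Bearer {bearer_token}"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["value"]
```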

Anyway, short term, consider the API approach. Long term, a contribution to
the project along these lines would probably be gratefully accepted. At
minimum, a Jira ticket to suggest the new feature would be a good starting
place.

/Adam



On Wed, Mar 8, 2023 at 9:53 AM Mike Rutlin <[email protected]> wrote:

>
> I have a general question on how to dynamically update and sync SAS tokens
> being utilized by the AzureBlobStorage processors (List, Fetch, Delete,
> etc.).
>
> From our NiFi we are accessing a storage container in a different cloud
> and thus cannot use Azure Managed Identity, as is standard for storage
> containers in the same cloud. Thus we need to use the SAS Token attribute /
> parameter supported by the controller service or by the processors.
>
> We pull the SAS token on a periodic basis, before the old SAS token
> expires.
>
> In manual testing of this, we set a sensitive Parameter to the SAS token
> and set up the controller service SAS Token attribute / parameter to use
> the Parameter we created. Everything worked as it should.
>
> The problem/issue occurs when the SAS token expires and we pull the new
> one. How do I modify the Parameter? Using the NiFi API seems messy and
> could potentially corrupt the flow configuration, along with the fact that
> it stops and starts the services / processors using the Parameter we are
> updating.
>
> I would like to use a flowfile attribute (which the documentation
> suggests is possible). However, the ListAzureBlobStorage processor is a
> start-of-flow processor (no inputs). Thus I can't run a script to get the
> new key and set an attribute that could then be used by the SAS Token
> attribute / parameter in the processor.
>
> Any ideas on how I can do this? It seems this would be a common practice
> to roll keys and then inject the new key into the data flow for either AWS
> or Azure.
>
> If ListAzureBlobStorage had an input I could update with new keys, or if
> the list processor had a failure output I could update the key upon
> expiration.
>
> Thanks for any and all guidance.
>
> Mike R
>
>
