throwawayaccount11111 opened a new issue, #30611:
URL: https://github.com/apache/airflow/issues/30611

   ### Apache Airflow Provider(s)
   
   hashicorp
   
   ### Versions of Apache Airflow Providers
   
   3.3.0
   
   ### Apache Airflow version
   
   2.5.3
   
   ### Operating System
   
   linux
   
   ### Deployment
   
   Official Apache Airflow Helm Chart
   
   ### Deployment details
   
   _No response_
   
   ### What happened
   
   We use Vault as the secrets backend for Airflow, and we have noticed quite 
an abnormal pattern: Airflow requests roughly 400 tokens every 10 minutes, even 
when all DAGs are disabled/paused. When the DAGs (roughly 300 in total) are 
enabled and running, the token count spikes to around 40K. The TTL on the tokens 
is 15 minutes, and most of our DAGs finish within 10 minutes.
   
   We initially thought this was due to a typical pattern in how DAGs access 
secrets, but further troubleshooting showed that the tokens are requested during 
authentication, which likely happens each time a pod comes up (we use the 
Kubernetes Executor).
   
   ### What you think should happen instead
   
   The token should be cached, and Airflow should not request a new token for 
every DAG execution. Even when no DAGs are executing, Airflow keeps requesting 
tokens.
   
   ### How to reproduce
   
   Start Airflow with the Kubernetes Executor and Vault as the secrets backend, 
then observe the number of tokens/leases in Vault.
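
   For reference, the secrets backend is configured roughly like this in `airflow.cfg` (the URL, mount point, paths, and role below are illustrative placeholders, not our real values):

   ```ini
   [secrets]
   backend = airflow.providers.hashicorp.secrets.vault.VaultBackend
   backend_kwargs = {"connections_path": "connections", "variables_path": "variables",
       "mount_point": "airflow", "url": "http://vault.example.com:8200",
       "auth_type": "kubernetes", "kubernetes_role": "airflow"}
   ```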
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   