potiuk commented on issue #52147:
URL: https://github.com/apache/airflow/issues/52147#issuecomment-3230111000

   > Same as https://github.com/apache/airflow/issues/50601 ?
   
   No, it's a completely different issue.
   
   > It seems rather Kafkaesque to me to have to set up another orchestrator 
alongside the Airflow orchestrator to clean up its working base.
   
   Not sure why you'd say that. It's fine to do cleanups, log-handling
   maintenance, etc. using cron or whatever mechanism you normally use to
   maintain your application (for example Kubernetes jobs).
   
   Airflow orchestration is for data-related pipelines. Deployment maintenance
   is even done by different roles and people - and this is part of the Airflow
   Security Model: Dag authoring is done by Dag Authors, and Airflow deployment
   management is done by Deployment Managers. See
   https://airflow.apache.org/docs/apache-airflow/stable/security/security_model.html.
   Dag Authors are not responsible for, and should not have access to, the same
   things a Deployment Manager does.
   
   This is not a "Kafkaesque" thing; it is a security feature.
   
   And if you want to bypass that because you feel like it - feel free. You
   can, for example, configure your deployment so that your workers make an ssh
   connection to the Airflow api-server and execute the "airflow db clean"
   command remotely. This is all entirely possible (but of course only if you
   intentionally want to break the security perimeter). In that case you could
   use Airflow to schedule it.
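   To make the idea above concrete, here is a minimal sketch of assembling such
   a remote invocation. `airflow db clean` and its `--clean-before-timestamp` /
   `--yes` flags are real Airflow CLI options; the host name, retention window,
   and the `build_remote_db_clean` helper itself are placeholders for whatever
   your deployment would actually use (and again, doing this deliberately
   crosses the security boundary described above).

   ```python
   import shlex
   from datetime import datetime, timedelta, timezone

   def build_remote_db_clean(host, days_to_keep, now=None):
       """Assemble an ssh invocation of `airflow db clean` (hypothetical helper).

       The command purges metadata rows older than the retention window.
       """
       now = now or datetime.now(timezone.utc)
       cutoff = (now - timedelta(days=days_to_keep)).strftime("%Y-%m-%d")
       remote_cmd = f"airflow db clean --clean-before-timestamp {cutoff} --yes"
       # Quote the remote command so it survives the ssh shell boundary.
       return f"ssh {host} {shlex.quote(remote_cmd)}"

   print(build_remote_db_clean("airflow@api-server", 30,
                               now=datetime(2024, 2, 1, tzinfo=timezone.utc)))
   # → ssh airflow@api-server 'airflow db clean --clean-before-timestamp 2024-01-02 --yes'
   ```

   A worker task would then run the resulting string via its usual shell
   execution path; scheduling it from a Dag is exactly the perimeter-breaking
   choice described above.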
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
