CarlosDomingues opened a new issue, #24740:
URL: https://github.com/apache/airflow/issues/24740

   ### Description
   
   Since Airflow now has a stable [REST API](https://airflow.apache.org/docs/apache-airflow/stable/stable-rest-api-ref.html), it would be great if we had an endpoint to upload files to `DAG_FOLDER`.
   
   ### Use case/motivation
   
   DAGs are loaded from files and directories stored in `DAG_FOLDER`, which is 
a filesystem path that must be shared among several Airflow components.
   
   I've explored a few options to populate and update `DAG_FOLDER` over the years:
   
   1. Use `git-sync` + cron to pull DAGs from Git repos. That does not scale operationally and requires the Airflow servers to access the repos where the source lives, which is bad for security.
   
   2. Use a middleman like AWS S3 and upload DAGs to it during a CD pipeline. Airflow can then periodically pull from it. IMO that's better than (1), but still not ideal.
   
   3. Use a shared mount (like AWS EFS or K8s volumes) between the Airflow and CD servers, so DAGs can be copied directly into `DAG_FOLDER`. That's cool, but it means tighter coupling between Airflow and CD infrastructure.
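   For reference, option (2) is typically just two `aws s3 sync` invocations — one on the CD side, one on the Airflow side. A minimal sketch, assuming a bucket named `my-dag-bucket` (the bucket name and paths are illustrative, not from any real setup):

   ```shell
   # CD pipeline side: push the repo's dags/ directory to the bucket.
   aws s3 sync dags/ s3://my-dag-bucket/dags/

   # Airflow side (e.g. a cron job or sidecar): pull the bucket into DAG_FOLDER.
   # --delete removes local files that were deleted upstream.
   aws s3 sync s3://my-dag-bucket/dags/ "$AIRFLOW_HOME/dags/" --delete
   ```

   This works, but it introduces an extra moving part (the bucket plus a polling job) between the pipeline and Airflow.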
   
   I think it would be operationally great if I could upload DAGs using a simple `curl` command. That would make the interaction between Airflow and CI/CD pipelines much more flexible, as well as enable finer-grained security for pipelines.
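
   To illustrate what a CD pipeline step could look like with such an endpoint — note the `/api/v1/dags/upload` path is purely hypothetical, no such endpoint exists in the stable REST API today:

   ```shell
   # Hypothetical: push a DAG file straight into DAG_FOLDER via the REST API.
   # The endpoint path and auth scheme here are assumptions for illustration.
   curl -X POST "https://airflow.example.com/api/v1/dags/upload" \
     --user "cd-pipeline-user:${AIRFLOW_API_PASSWORD}" \
     -F "file=@dags/my_pipeline.py"
   ```

   A dedicated API user with upload-only permissions would then be the only credential the pipeline needs.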
   
   
   ### Related issues
   
   _No response_
   
   ### Are you willing to submit a PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

