Thanks for the response! Perhaps it will be easier if I explain my use-case, and you can tell me if I'm missing an obvious, easier way to do what I'm trying to do.
We are building an infrastructure-as-a-service platform where users can kick off a workflow for themselves and, in their request, specify the schedule_interval and start_date. The majority of the workflow is the same for every user request; only some config parameters and the schedule differ per user. However, my understanding is that the "unit of scheduling" in Airflow is a DAG, so in order to leverage Airflow's scheduling functionality, each user's request needs to be represented by its own DAG with the specified schedule_interval and start_date.

One way to do this is to make a DAG template file, populate it with the user request data, and write the resulting .py file to the DAG_FOLDER. I was wondering whether there is a way to do this directly in the running Airflow scheduler process itself; that is, to inject a DAG definition into the scheduler without writing a physical .py file to disk. Alternatively, if not, is it possible to have multiple schedules for a single DAG (in which case we would not need a DAG per user request)?

Thanks,
-Saleil

From: [email protected]
At: 05/08/20 22:28:31
To: Saleil Bhat (BLOOMBERG/ 919 3RD A), [email protected]
Subject: Re: Dynamically adding DAGs to Airflow

Airflow will continue to periodically look for new dags when running --- whether dynamic or otherwise. Does your dag show up when you do airflow list_dags? Then it will show up in the webserver sooner or later. If it does not, then it's likely something is wrong with your dag file.

There has been talk of changing Airflow's behavior of automatically parsing every dag over and over. This could reduce unnecessary processing and make "expensive" dynamic dags feasible, but I don't think this has been implemented yet.

On Fri, May 8, 2020 at 3:55 PM Saleil Bhat (BLOOMBERG/ 919 3RD A) <[email protected]> wrote:

Hey all, I'm new to Airflow, and I have a question concerning creating DAGs on the fly.
I saw this snippet in the documentation: https://airflow.apache.org/docs/stable/faq.html#how-can-i-create-dags-dynamically which suggests you can programmatically create DAGs.

My question is: can I invoke code similar to this to create a new DAG while Airflow is already running? For example, suppose I have a DAG factory which takes some config parameters and constructs a DAG. Would it be possible to use the CLI and/or REST API to trigger a call to this DAG factory to add a new DAG to my Airflow system?

Thanks,
-Saleil
