Hi all, I am pretty new to Airflow, and we are planning to use it at the organisation level for automation (mainly data-extraction pipelines).
I have set up an Airflow cluster with one server and two VDI machines, using Airflow 1.10.9, Postgres, Celery 4.4.0, and Redis as the message broker. I can run the example DAGs on the cluster successfully, but when I run my own DAG, it gets scheduled and picked up by a worker machine and then fails with the error "DAG ID could not be found". On closer inspection, I noticed that when the worker picks up the task, it passes a -sd (--subdir) argument that points to the DAG folder on my scheduler machine/server.

*Things I can confirm:*

1. The DAG is present inside AIRFLOW_HOME/dags on all machines, master and workers.
2. The Airflow config file on each worker points to the correct local DAG location.

I have tried multiple things but could not figure out the issue. Please share any suggestions.
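In case it helps narrow things down, here is a quick check that can be run on the scheduler and on each worker to compare the dags_folder each node resolves (a minimal sketch, assuming Airflow 1.10.x on all machines):

    # Minimal diagnostic sketch, assuming Airflow 1.10.x on every node.
    # Run on the scheduler and on each worker: if the printed paths differ,
    # the -sd/--subdir path the scheduler attaches to a task will not
    # exist on the worker, and the worker's DagBag cannot find the DAG.
    from airflow.configuration import conf

    print(conf.get("core", "dags_folder"))

My understanding is that with the Celery executor the -sd value is the DAG file's location as seen by the scheduler, so that exact path also has to resolve on the worker.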
Thank you,
Devang