Right now the location of the dags folder needs to be identical on both the 
scheduler and all the workers. Sorry.
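
For example (a minimal sketch, assuming an AIRFLOW_HOME of /home/airflow/airflow 
on every machine -- the path itself is just an illustration), the dags_folder 
setting in airflow.cfg has to resolve to the same absolute path on the scheduler 
and on each worker:

    # airflow.cfg -- same value on the scheduler and every worker
    [core]
    dags_folder = /home/airflow/airflow/dags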

This is a bug, and it shouldn't need to be the case. (Apache Jira is having a 
hiccough right now so I can't find the issue at the moment.)
-ash
On May 7 2020, at 8:53 am, devang pandey <[email protected]> wrote:
> Hi All,
>
> I am pretty new to Airflow and we are planning to use it at the organisation 
> level for automation (mainly data-extraction pipelines).
>
> I have set up an Airflow cluster with one server and 2 VDI machines, using 
> Airflow 1.10.9, Postgres, Celery 4.4.0, and Redis as the message broker.
>
> The thing is, I am able to run the example dags on my cluster successfully, 
> but when I try to run my own dag, it gets scheduled and picked up by a worker 
> machine but then fails with the error message "DAG ID could not found".
>
> On taking a closer look, I observed that while picking up the task, my worker 
> machine is passing a -sd parameter which points to the dag folder of my 
> scheduler machine/server.
>
> Things I can confirm:
>
> 1- The DAG is present on all machines, master and workers, inside 
> AIRFLOW_HOME/dags
> 2- The Airflow worker config file points to the correct local dag location.
>
> I have tried multiple things but could not figure out the issue. Could you 
> please provide suggestions?
>
>
> Thank you,
> Devang
>