Hi,
It seems that Airflow currently has to handle the situations below:
- DAGs discovered by the scheduler, but not yet by the webserver
- DAGs discovered by the webserver, but not yet by the scheduler
I still don't quite understand why there is separate discovery logic in the
scheduler
Yes, after looking at the code, it looks like implementing a many-to-many
relationship between pools and tasks would require a significant redesign.
My workaround is to implement it externally: borrow the required resources
from an external pool using a sensor, and return them once the task
is done using
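The borrow/return workaround above can be sketched conceptually with a plain-Python semaphore standing in for the external pool. This is only an analogy, not Airflow code: the `sensor_like_borrow` step plays the role of the sensor blocking until a slot frees up, and the `finally` block plays the role of the follow-up step that returns the resource.

```python
import threading

# A semaphore standing in for a hypothetical external pool with 2 slots.
external_pool = threading.Semaphore(2)

def sensor_like_borrow(pool, timeout=5.0):
    """Block until a slot is free, like a sensor poking the external pool."""
    if not pool.acquire(timeout=timeout):
        raise TimeoutError("no slot became available")

def return_slot(pool):
    """Give the borrowed slot back once the task is done."""
    pool.release()

def run_task(pool):
    sensor_like_borrow(pool)   # borrow a resource before the real work
    try:
        result = "task done"   # placeholder for the actual task body
    finally:
        return_slot(pool)      # always return the slot, even on failure
    return result

print(run_task(external_pool))
```

In a real DAG this would be three tasks in sequence (borrow-sensor, work, return), with the return task configured to run even when the work task fails.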
A task is assigned to a pool by specifying the pool's name on the task.
The docs suggest that the pool argument is a string, not a list of strings:
https://airflow.apache.org/code.html#baseoperator
And looking at the code, it does seem like this is a relationship of one task
assigned to zero or one
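A toy model of that relationship (hypothetical names, not Airflow's actual classes): each task carries at most one pool name as a plain string, so "this task needs slots from two pools" has nowhere to live without a schema change.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Task:
    task_id: str
    pool: Optional[str] = None  # zero or one pool, by name; never a list

t1 = Task("extract", pool="db_pool")
t2 = Task("notify")             # no pool assigned at all

# Many-to-many (one task holding slots in several pools) does not fit:
# there is only a single string-valued slot on each task.
print(t1.pool, t2.pool)
```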
Hi,
When adding a new DAG, sometimes we can see:
```
This DAG isn't available in the web server's DagBag object. It shows up in this
list because the scheduler marked it as active in the metadata database.
```
In views.py, the webserver collects the DAGs under "DAGS_FOLDER" by instantiating a
DagBag
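Roughly, the DagBag walks DAGS_FOLDER and parses the Python files it finds there. The sketch below imitates only that file-discovery step with a throwaway directory standing in for DAGS_FOLDER; the real DagBag additionally imports each file (and, in safe mode, applies heuristics to skip files that clearly contain no DAGs).

```python
import os
import tempfile

def discover_dag_files(dags_folder):
    """Collect .py files under dags_folder, like DagBag's first pass."""
    found = []
    for root, _dirs, files in os.walk(dags_folder):
        for name in files:
            if name.endswith(".py"):
                found.append(os.path.join(root, name))
    return sorted(found)

# Demo with a temporary folder standing in for DAGS_FOLDER.
with tempfile.TemporaryDirectory() as folder:
    open(os.path.join(folder, "my_dag.py"), "w").close()
    open(os.path.join(folder, "notes.txt"), "w").close()
    print(discover_dag_files(folder))  # only my_dag.py is picked up
```

Because the scheduler and the webserver each build their own DagBag on their own schedule, a new file can be visible to one process before the other, which is exactly the situation the warning above describes.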
Welcome!
On the GitHub side: none of the committers (currently) have permission to
merge/edit tickets on GitHub. Recently-ish, Apache made it possible for projects
to use GitHub as the primary repo, and I called a vote before I stopped for
paternity leave (which I'm coming to the end of now). The
Yes, I want to know about the event of a DagRun being created.
From: crisp...@gmail.com on behalf of Chris Palmer
Sent: May 11, 2018, 15:46
To: dev@airflow.incubator.apache.org
Subject: Re: Reply: How to know the DAG is starting to run
It's