Hi all,

I think that the ability to fill up the DagBag from remote locations would
be useful (in my use case, having the dags folder in HDFS would greatly
simplify the release process).

Was there any discussion on this previously? I looked around briefly but
couldn't find it.

Maybe the method *DagBag.collect_dags* in *airflow/models.py* could
delegate the directory-walking part to scheme-specific methods based on
the *dags_folder* prefix, in a sort of plugin architecture. This would
allow the dags_folder to be defined like
hdfs://namenode/user/airflow/dags, or s3://...
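
To make that a bit more concrete, here's a rough sketch of what I have
in mind (plain Python, not actual Airflow code; the walker functions and
the registry name are made up for illustration):

    from urllib.parse import urlparse
    import os

    def _walk_local(path):
        # Yield paths of .py files under a local dags folder.
        for root, _dirs, files in os.walk(path):
            for name in files:
                if name.endswith(".py"):
                    yield os.path.join(root, name)

    def _walk_hdfs(url):
        # Placeholder: an HDFS walker (e.g. via WebHDFS) would go here.
        raise NotImplementedError("HDFS dags_folder walking not implemented yet")

    # Registry mapping URL schemes to walker callables; plugins could
    # register extra schemes (s3, gcs, ...) here.
    DAG_FOLDER_WALKERS = {
        "": _walk_local,      # plain paths like /home/airflow/dags
        "hdfs": _walk_hdfs,
    }

    def walk_dags_folder(dags_folder):
        # Pick a walker based on the dags_folder prefix (URL scheme).
        scheme = urlparse(dags_folder).scheme
        if scheme not in DAG_FOLDER_WALKERS:
            raise ValueError("no dags_folder walker for scheme %r" % scheme)
        return DAG_FOLDER_WALKERS[scheme](dags_folder)

    # e.g. walk_dags_folder("hdfs://namenode/user/airflow/dags")

collect_dags would then iterate over whatever the walker yields instead
of calling os.walk directly.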

If this makes sense, I'd love to work on it.

Cheers,
Diogo Franco
