Hi Diogo,

This would be valuable for me as well; I'd love first-class support for
hdfs://..., s3://..., gcs://..., etc. as a value for dags_folder. As a
workaround, I deploy a maintenance DAG that periodically downloads other
DAGs from GCS into my DAG folder. Not perfect, but it gets the job done.
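
For reference, a rough sketch of that kind of maintenance DAG (the bucket
name and paths are hypothetical, it assumes gsutil is installed on the
workers and AIRFLOW_HOME is set, and it uses Airflow 1.x import paths):

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.bash_operator import BashOperator

    dag = DAG(
        dag_id="sync_dags_from_gcs",
        start_date=datetime(2018, 1, 1),
        schedule_interval=timedelta(minutes=5),
        catchup=False,
    )

    # Mirror the GCS prefix into the local dags_folder so the scheduler picks
    # up new or updated DAG files on its next parse; -d also removes DAG
    # files that were deleted from the bucket.
    sync_dags = BashOperator(
        task_id="gsutil_rsync_dags",
        bash_command="gsutil -m rsync -r -d gs://my-bucket/dags/ $AIRFLOW_HOME/dags/",
        dag=dag,
    )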
Chris

On Thu, Mar 15, 2018, at 6:32 PM, Diogo Franco wrote:
> Hi all,
> 
> I think that the ability to fill up the DagBag from remote locations would
> be useful (in my use case, having the dags folder in HDFS would greatly
> simplify the release process).
> 
> Was there any discussion on this previously? I looked around briefly but
> couldn't find it.
> 
> Maybe the method **DagBag.collect_dags** in *airflow/models.py* could
> delegate the walking part to specific methods based on the *dags_folder*
> prefix, in a sort of plugin architecture. This would allow the dags_folder
> to be defined like hdfs://namenode/user/airflow/dags, or s3://...
> 
> If this makes sense, I'd love to work on it.
> 
> Cheers,
> Diogo Franco
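
For illustration, a rough sketch of the kind of prefix-based dispatch
described above (the helper names and registry are hypothetical, not
existing Airflow code; only the dispatch idea is shown):

    from urllib.parse import urlparse


    def _collect_local(dagbag, path):
        # Existing behaviour: walk the local dags_folder for .py files and
        # hand each one to dagbag.process_file.
        raise NotImplementedError("placeholder for the current local walk")


    def _collect_hdfs(dagbag, path):
        # Hypothetical: list DAG files through an HDFS client (e.g. WebHDFS)
        # and process each fetched file.
        raise NotImplementedError("placeholder for an HDFS walker")


    def _collect_s3(dagbag, path):
        # Hypothetical: list objects with boto3 and process each DAG file.
        raise NotImplementedError("placeholder for an S3 walker")


    # Scheme -> walker registry; plugins could register extra schemes here.
    DAG_FOLDER_WALKERS = {
        "": _collect_local,      # plain paths keep today's behaviour
        "file": _collect_local,
        "hdfs": _collect_hdfs,
        "s3": _collect_s3,
    }


    def collect_dags(dagbag, dag_folder):
        """Dispatch to a walker based on the dags_folder URL scheme."""
        scheme = urlparse(dag_folder).scheme
        try:
            walker = DAG_FOLDER_WALKERS[scheme]
        except KeyError:
            raise ValueError(
                "no DAG folder walker registered for scheme %r" % scheme)
        return walker(dagbag, dag_folder)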
