I'm using Ansible to deploy Airflow. The steps are:

- First install Airflow using pip (or an RC using curl).
- Run `airflow version` to trigger the creation of the default config.
- Set the config variables correctly in the config using Ansible.
- Deploy the supervisord files.
- Start everything.
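In Ansible terms the tasks look roughly like this. This is only a minimal sketch: `airflow_home`, `airflow_db_conn`, the supervisord program names, and the template file names are placeholders I've made up here, and the exact config options you set will depend on your setup.

- name: Install Airflow with pip
  pip:
    name: apache-airflow
    version: "{{ airflow_version }}"
    executable: pip3

- name: Run airflow version once to generate the default config
  command: airflow version
  args:
    creates: "{{ airflow_home }}/airflow.cfg"
  environment:
    AIRFLOW_HOME: "{{ airflow_home }}"

- name: Set config variables in airflow.cfg
  ini_file:
    path: "{{ airflow_home }}/airflow.cfg"
    section: core
    option: sql_alchemy_conn
    value: "{{ airflow_db_conn }}"

- name: Deploy the supervisord program files
  template:
    src: "{{ item }}.conf.j2"
    dest: "/etc/supervisor/conf.d/{{ item }}.conf"
  with_items:
    - airflow-webserver
    - airflow-scheduler

- name: Start everything via supervisord
  supervisorctl:
    name: "{{ item }}"
    state: started
  with_items:
    - airflow-webserver
    - airflow-scheduler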
A separate role is there to deploy Postgres. But if you are working in a cloud environment, you can also get Postgres/MySQL as a service.

Hope this helps.

Cheers, Fokko

2017-11-15 3:19 GMT+01:00 Marc Bollinger <[email protected]>:

> Samson <https://github.com/zendesk/samson> deploy that runs a script
> running a Broadside <https://github.com/lumoslabs/broadside> deploy for
> ECS, which bounces the Web and Scheduler workers, and updates the DAG
> directory on the workers. Docker images come from a GitHub -> Travis ->
> Quay <https://quay.io/> CI setup.
>
> On Tue, Nov 14, 2017 at 10:18 AM, Alek Storm <[email protected]> wrote:
>
> > Our TeamCity server detects the master branch has changed, then packages
> > up the repo containing our DAGs as an artifact. We then use SaltStack to
> > trigger a bash script on the targeted servers that downloads the
> > artifact, moves the files to the right place, and restarts the scheduler
> > (on the master).
> >
> > This allows us to easily revert changes by redeploying a particular
> > TeamCity artifact, without touching the git history.
> >
> > Alek
> >
> > On Nov 14, 2017 11:02 AM, "Andy Hadjigeorgiou" <[email protected]>
> > wrote:
> >
> > > Hey,
> > >
> > > Was just wondering what tools & services everyone uses to deploy new
> > > versions of their data pipelines (understandably this would vary
> > > greatly based on tech stack), but I'd love to hear what the community
> > > has been using.
> > >
> > > - Andy
