mik-laj edited a comment on issue #16041:
URL: https://github.com/apache/airflow/issues/16041#issuecomment-847721984


   This file is also missing a few things that make this docker-compose unsafe for production (see the sketch after this list):
   - [CPU/memory resource limits](https://github.com/compose-spec/compose-spec/blob/002f33d2fa78425cd31f096444280cc4c9daf445/schema/compose-spec.json#L518-L519) - each container has access to all system resources.
   - SSL - the connection to the container should be encrypted by Traefik or another proxy, or we should configure SSL in the webserver (`[webserver] web_server_ssl_*`).
   - Containers use the local file system, but we should use volumes in a production environment.
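   
   A minimal sketch of what these fixes could look like in a compose file (the image tag, certificate paths, resource values, and volume name are illustrative assumptions, not taken from the actual file; `deploy.resources` also needs a compose version that honors it outside swarm mode):
   
   ```yaml
   services:
     airflow-webserver:
       image: apache/airflow:2.1.0            # illustrative tag
       deploy:
         resources:
           limits:
             cpus: "2.0"                      # cap CPU per container
             memory: 4G                       # cap memory per container
       environment:
         # Serve the UI over HTTPS when no TLS-terminating proxy sits in front
         AIRFLOW__WEBSERVER__WEB_SERVER_SSL_CERT: /opt/airflow/certs/airflow.crt
         AIRFLOW__WEBSERVER__WEB_SERVER_SSL_KEY: /opt/airflow/certs/airflow.key
       volumes:
         - airflow-logs:/opt/airflow/logs     # named volume instead of a bind mount
   
   volumes:
     airflow-logs:
   ```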
   
   We should also mention the possible ways of deploying DAGs, e.g. Git Sync (a sketch follows below).
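   
   For example, DAGs could be pulled into a shared volume by a git-sync sidecar; a minimal sketch, where the image tag and repository URL are placeholders:
   
   ```yaml
   services:
     git-sync:
       image: k8s.gcr.io/git-sync/git-sync:v3.3.0
       environment:
         GIT_SYNC_REPO: https://github.com/example/dags.git  # placeholder repo
         GIT_SYNC_BRANCH: main
         GIT_SYNC_ROOT: /git
         GIT_SYNC_DEST: repo
         GIT_SYNC_WAIT: "60"                                 # re-sync every 60 seconds
       volumes:
         - dags:/git   # Airflow services would mount this volume as their DAGs folder
   
   volumes:
     dags:
   ```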
   
   I also recommend the latest discussion on Slack, where I explained the assumptions of this guide:
   
https://apache-airflow.slack.com/archives/CCQ7EGB1P/p1621801810231000?thread_ts=1621711385.211600&cid=CCQ7EGB1P
   
   > This example docker-compose file has been prepared for the most popular configuration. I've only tested it with CeleryExecutor. As for other executor configurations, I think that is beyond the scope of this article. The purpose of this article was to make the first launch of Airflow easy for someone who is unfamiliar with it, so that they can test and check how Airflow works.
   > One way to do this is to minimize the number of actions the user has to take. Now, to start Airflow they just need to run two simple commands:
   > ```
   > curl...
   > docker-compose up
   > ```
   > You don't need to select a database engine or an executor, or set other configuration options. You don't even need to know what these are to run Airflow.
   > We should prepare separate guides on how to configure docker-compose in other configurations. I even started [working on a tool](https://github.com/apache/airflow/issues/8605#issuecomment-759469443) that would allow us to generate several docker-compose filesets based on user-supplied options, but I gave up on that work when Polidea was acquired by Snowflake.

