JavierLTPromofarma edited a comment on issue #8605:
URL: https://github.com/apache/airflow/issues/8605#issuecomment-685451441


   Hello. I put together my own set of Docker files from a mix of the examples here, ending up with a docker-compose file, a Dockerfile and a Makefile. Using the docker-compose and Makefile from this post as a starting point, I have already solved some of the problems we ran into while adapting it to our needs. But as a Docker and Airflow newbie, I would have liked these needs to have already been addressed by an agreed-upon best-practice solution, so I'll mention them in case they can be covered by the future official file (or by a tutorial or similar):
   
   - Adding a JSON file of variables
   - Adding the aws_credentials and aws_config files
   - Modifying files in Airflow's core code. This is probably not a good idea, but I have needed it in three cases:
       1. A plugin hook (number one) that relies on another plugin hook (number two), so the first one doesn't detect the second. I modified the __init__.py in the hooks folder (see the sketch right after this list for a possible alternative).
       2. Instead of creating a new folder as a library for a modified secrets backend, I just modified the original file at airflow/contrib/secrets/aws_secrets_manager.py
       3. A one-line modification of an operator that I didn't want to recreate as a plugin
   - Installing Python packages from a requirements.txt file
   - Adding a utilities package
   - After a few weeks, we hit a Docker error about having reached the maximum depth, or something like that (I don't recall it exactly)
   
   Regarding the docker-compose file, I would like to see an explanation of why the webserver and the scheduler run as separate services and how that works. For instance, I don't understand whether, in some cases, a command should be added to just one of the containers.
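   
   For what it's worth, one-off commands can at least be aimed at a single service from the host. A sketch, assuming the compose file below (whose services use a /bin/bash entrypoint) and a hypothetical DAG id my_dag:
   ```
   # Run a one-off command in a fresh container for just one service
   # (the -c string goes to the /bin/bash entrypoint):
   docker-compose run --rm webserver -c "airflow list_dags"
   
   # Or execute inside the already-running scheduler container:
   docker-compose exec scheduler bash -c "airflow list_dag_runs my_dag"
   ```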
   
   The code I currently have is:
   **Dockerfile**
   ```
   FROM apache/airflow:1.10.10
   
   # Overwrite core Airflow files with our patched versions (see the caveats above)
   COPY plugins/aws_secrets_manager_backend.py /home/airflow/.local/lib/python3.6/site-packages/airflow/contrib/secrets/aws_secrets_manager.py
   COPY plugins/aws_secrets_manager_hook.py /home/airflow/.local/lib/python3.6/site-packages/airflow/hooks/aws_secrets_manager_hook.py
   COPY hooks_init.py /home/airflow/.local/lib/python3.6/site-packages/airflow/hooks/__init__.py
   
   # AWS config and credentials for the Secrets Manager backend
   COPY aws_config /home/airflow/.aws/config
   COPY aws_credentials /home/airflow/.aws/credentials
   
   # Extra Python dependencies
   COPY requirements.txt requirements.txt
   RUN pip3 install -r requirements.txt --user
   ```
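   
   The compose file below builds this image automatically (build: .), but it can also be built by hand if you want to inspect it; my-airflow is a hypothetical tag:
   ```
   docker build -t my-airflow:1.10.10 .
   ```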
   **docker-compose**
   ```
   version: "3.7"
   x-airflow-environment: &airflow-environment
     AIRFLOW__CORE__EXECUTOR: LocalExecutor
     AIRFLOW__CORE__LOAD_EXAMPLES: "False"
     AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres:5432/airflow
     AIRFLOW__CORE__FERNET_KEY: FB0o_zt4e3Ziq3LdUUO7F2Z95cvFFx16hU8jTeR1ASM=
     AIRFLOW__WEBSERVER__DAG_DEFAULT_VIEW: graph
     AIRFLOW__SECRETS__BACKEND: airflow.contrib.secrets.aws_secrets_manager.SecretsManagerBackend
     AIRFLOW__OPERATORS__DEFAULT_RAM: 2048
   
   services:
     postgres:
       image: postgres:11.5
       environment:
         POSTGRES_USER: airflow
         POSTGRES_DB: airflow
         POSTGRES_PASSWORD: airflow
     init:
       build: .
       environment:
         <<: *airflow-environment
       depends_on:
         - postgres
       volumes:
         - ./dags:/opt/airflow/dags
         - ./plugins:/opt/airflow/plugins
         - ./logs:/opt/airflow/logs
       entrypoint: /bin/bash
       command: >
         -c "airflow list_users || (airflow initdb
         && airflow create_user --role Admin --username airflow --password 
airflow -e [email protected] -f airflow -l airflow)"
       restart: on-failure
     webserver:
       build: .
       ports:
         - 8080:8080
       environment:
         <<: *airflow-environment
       depends_on:
         - init
       volumes:
         - ./dags:/opt/airflow/dags
         - ./plugins:/opt/airflow/plugins
         - ./variables_secret.json:/opt/airflow/variables_secret.json
         - ./logs:/opt/airflow/logs
         - ./utilities:/opt/airflow/utilities
       entrypoint: /bin/bash
       command: -c "airflow variables -i /opt/airflow/variables_secret.json && 
airflow webserver"
       restart: always
     scheduler:
       build: .
       environment:
         <<: *airflow-environment
       depends_on:
         - webserver
       volumes:
         - ./dags:/opt/airflow/dags
         - ./plugins:/opt/airflow/plugins
         - ./variables_secret.json:/opt/airflow/variables_secret.json
         - ./logs:/opt/airflow/logs
         - ./utilities:/opt/airflow/utilities
       entrypoint: /bin/bash
       command: -c "airflow variables -i /opt/airflow/variables_secret.json && 
airflow scheduler"
       restart: always
   ```
   **Makefile**
   ```
   .PHONY: run stop rm
   
   run:
        docker-compose -f docker-compose.yml up -d --remove-orphans --build --force-recreate
        @echo "Airflow running on http://localhost:8080";
   
   stop:
        docker-compose -f docker-compose.yml stop
   
   rm: stop
        docker-compose -f docker-compose.yml rm
   ```
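   
   For reference, the intended usage is then just the three targets (note that Makefile recipe lines must be indented with real tabs, which may not survive copy-paste from this comment):
   ```
   make run    # build the image and start postgres, init, webserver and scheduler
   make stop   # stop the containers without removing them
   make rm     # stop and then remove the containers
   ```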
   Kind regards

