Osamaelgendy opened a new issue, #27475:
URL: https://github.com/apache/airflow/issues/27475

   ### Apache Airflow version
   
   Other Airflow 2 version (please specify below)
   
   ### What happened
   
   I used the puckel/docker-airflow:latest image to run the Airflow webserver in a 
docker-compose file with other services.
   The problem is that when Airflow parses dagfile.py, an import error pops up.
   I read that **importing operators, sensors, and hooks added in plugins via 
airflow.{operators,sensors,hooks}.<plugin_name> is no longer supported, and 
these extensions should instead be imported as regular Python modules**,
   
   but I don't know how to do this exactly. I have already installed some Python 
packages in entrypoint.sh, as described in the original repo,
   
   but how do I install this?
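   If I understand the deprecation note correctly, code in the mounted plugins folder is now imported under its own module name rather than under `airflow.operators`. Below is a minimal, self-contained sketch of the old vs. new import style; the file and class names (`my_operators.py`, `HelloOperator`) are hypothetical, and the temp directory only stands in for the `/usr/local/airflow/plugins` mount from the compose file:

   ```python
   import sys
   import tempfile
   from pathlib import Path

   # Simulate the mounted plugins folder; in a real deployment the file
   # plugins/my_operators.py would already exist on disk.
   plugins_dir = Path(tempfile.mkdtemp()) / "plugins"
   plugins_dir.mkdir()
   (plugins_dir / "my_operators.py").write_text(
       "class HelloOperator:\n"
       "    # A real plugin would subclass airflow.models.BaseOperator.\n"
       "    def execute(self, context=None):\n"
       "        return 'hello'\n"
   )

   # Airflow puts the plugins folder on sys.path; emulated here by hand.
   sys.path.insert(0, str(plugins_dir))

   # Old, no-longer-supported style:
   #   from airflow.operators.my_plugin import HelloOperator
   # New style - import the plugin file as a regular Python module:
   from my_operators import HelloOperator

   print(HelloOperator().execute())  # hello
   ```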
   
   ### What you think should happen instead
   
   I think it should be installable like any other Python package via the 
requirements.txt file, but I don't know how.
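   
   For pip-installable packages, the puckel image's entrypoint.sh is commonly described as installing a requirements.txt mounted into $AIRFLOW_HOME before starting Airflow. I have not verified this against the image itself, so the exact path and pip flags below are assumptions; the pattern looks roughly like:
   
   ```shell
   # Assumed sketch of the requirements-install step in an Airflow entrypoint;
   # check the image's own entrypoint.sh for the real path and flags.
   AIRFLOW_HOME="${AIRFLOW_HOME:-/usr/local/airflow}"

   if [ -f "$AIRFLOW_HOME/requirements.txt" ]; then
     # Install user-provided packages before the webserver/scheduler starts.
     pip install --user -r "$AIRFLOW_HOME/requirements.txt"
   else
     echo "no requirements.txt found in $AIRFLOW_HOME"
   fi
   ```
   
   Plugin code imported as a regular module, by contrast, does not need installing at all; it only needs to sit in the mounted plugins folder.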
   
   ### How to reproduce
   
   _No response_
   
   ### Operating System
   
   Windows
   
   ### Versions of Apache Airflow Providers
   
   puckel/docker-airflow:latest
   
   ### Deployment
   
   Docker-Compose
   
   ### Deployment details
   
   **Docker-compose file**
   
   ```yaml
   version: "3.7"
   
   services:
     database:
       image: mysql:8.0
       platform: linux/amd64
       extra_hosts:
         - "somehost:162.242.195.82"
       command:
         - "--default-authentication-plugin=mysql_native_password"
       environment:
         - MYSQL_RANDOM_ROOT_PASSWORD=yes
         - MYSQL_DATABASE=codetest
         - MYSQL_USER=codetest
         - MYSQL_PASSWORD=swordfish
         - MYSQL_TIMEOUT=300000
       ports:
         - "3306:3306"
       restart: always
   
     clickhouse:
       image: "yandex/clickhouse-server"
       environment:
         - CLICKHOUSE_USER=codetest
         - CLICKHOUSE_PASSWORD=swordfish
         - CLICKHOUSE_DATABASE=codetest
       ulimits:
         nofile:
           soft: 262144
           hard: 262144
       ports:
         - "9999:9000" # native interface
         - "8888:8123" # http interface
   
     postgres:
       image: postgres:9.6
       environment:
         - POSTGRES_USER=airflow
         - POSTGRES_PASSWORD=airflow
         - POSTGRES_DB=airflow
       logging:
         options:
           max-size: 10m
           max-file: "3"
   
     webserver:
       image: puckel/docker-airflow:latest
       #restart: always
       depends_on:
         - postgres
       environment:
         - INSTALL_MYSQL=y
         - LOAD_EX=n
         - EXECUTOR=Local
       logging:
         options:
           max-size: 40m
           max-file: "5"
       volumes:
         - ./dags:/usr/local/airflow/dags
         - ./data:/usr/local/airflow/data
         - ./logs:/usr/local/airflow/logs
         - ./plugins:/usr/local/airflow/plugins
         - ./requirements.txt:/usr/local/airflow/requirements.txt
         - ./config/airflow.cfg:/opt/airflow/airflow.cfg
         - ./docker-entrypoint.sh:/usr/local/airflow/docker-entrypoint.sh
       entrypoint: ./docker-entrypoint.sh
       ports:
         - "8080:8080"
       command: webserver
       healthcheck:
         test: ["CMD-SHELL", "[ -f /usr/local/airflow/airflow-webserver.pid ]"]
         interval: 30s
         timeout: 30s
         retries: 3
   ```
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
