dobixu opened a new issue #19534:
URL: https://github.com/apache/airflow/issues/19534


   ### Apache Airflow version
   
   2.2.1 (latest released)
   
   ### Operating System
   
   CentOS 8.3.2011
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow-providers-celery==2.1.0
   apache-airflow-providers-docker==2.2.0
   apache-airflow-providers-ftp==2.0.1
   apache-airflow-providers-grpc==2.0.1
   apache-airflow-providers-http==2.0.1
   apache-airflow-providers-imap==2.0.1
   apache-airflow-providers-mysql==2.1.1
   apache-airflow-providers-postgres==2.3.0
   apache-airflow-providers-redis==2.0.1
   apache-airflow-providers-sqlite==2.0.1
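   
   For completeness, this list was collected from inside the running image with something like:
   
   ```bash
   # list pinned provider versions inside the container (name matches the compose file below)
   docker exec airflow.webserver pip freeze | grep apache-airflow-providers
   ```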
   
   
   ### Deployment
   
   Docker-Compose
   
   ### Deployment details
   
   # Dockerfile
   ```dockerfile
   FROM centos:latest
   RUN dnf install -y wget python38 python38-devel postgresql-devel mysql-devel gcc gcc-c++ sqlite
   
   # COPY does not expand "~", so use an absolute path for root's pip config
   RUN mkdir -p /root/.pip
   COPY pip.conf /root/.pip/
   COPY get-pip.py /home/manyun/
   
   ARG PIP_VERSION=21.1.1
   ARG AIRFLOW_VERSION=2.2.1
   ARG PYTHON_VERSION=3.8
   ARG CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt"
   
   RUN python3 /home/manyun/get-pip.py --disable-pip-version-check --no-cache-dir "pip==${PIP_VERSION}" && \
       ln -s /usr/bin/python3 /usr/local/bin/python && \
       pip config set global.index-url https://mirrors.aliyun.com/pypi/simple/ && \
       pip install connexion && \
       ln -s /usr/bin/gunicorn /usr/local/bin/gunicorn && \
       ln -s /usr/bin/celery /usr/local/bin/celery && \
       pip install "apache-airflow[async,celery,grpc,http,docker,mysql,postgres,redis]==${AIRFLOW_VERSION}" --constraint "${CONSTRAINT_URL}"
   
   WORKDIR /data/airflow/
   ENTRYPOINT ["airflow"]
   ```
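   
   The image is built and smoke-tested roughly like this (the tag matches the compose file below):
   
   ```bash
   docker build -t registry.manyun-inc.com/services/airflow:1.0 .
   # ENTRYPOINT is "airflow", so this prints the installed version
   docker run --rm registry.manyun-inc.com/services/airflow:1.0 version
   ```
   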
   # Docker-compose.yaml
   
   ```yaml
   version: '3.8'
   x-airflow-common:
     &airflow-common
     image: registry.manyun-inc.com/services/airflow:1.0
     environment:
       &airflow-common-env
       TZ: 'Asia/Shanghai'
       AIRFLOW_HOME: '/data/airflow'
       #  env_file:
       #- mysql.env
     volumes:
       - /etc/localtime:/etc/localtime:ro
       - /etc/timezone:/etc/timezone:ro
       - /var/run/docker.sock:/var/run/docker.sock
       - ./dags:/data/airflow/dags
       - ./logs:/data/airflow/logs
       #- ./d_task:/data/airflow/d-task
       - ./plugins:/data/airflow/plugins
       - ./airflow.cfg:/data/airflow/airflow.cfg
   services:
     airflow-webserver:
       <<: *airflow-common
       container_name: airflow.webserver
       command: webserver
       ports:
         - 18080:8080
       healthcheck:
         test: ["CMD", "curl", "--fail", "http://localhost:8080/health"]
         interval: 10s
         timeout: 10s
         retries: 5
       restart: always
     airflow-scheduler1:
       <<: *airflow-common
       container_name: airflow.scheduler_1
       command: scheduler
       healthcheck:
         test: "ps axu | grep -vE 'grep|restart' | grep 'scheduler'"
         #test: ["CMD-SHELL", 'airflow jobs check --job-type SchedulerJob --hostname "$${HOSTNAME}"']
         interval: 10s
         timeout: 10s
         retries: 5
       restart: always
   
     airflow-worker:
       <<: *airflow-common
       container_name: airflow.worker
       command: celery worker
       healthcheck:
         test:
           - "CMD-SHELL"
           - 'celery --app airflow.executors.celery_executor.app inspect ping -d "celery@$${HOSTNAME}"'
         interval: 10s
         timeout: 10s
         retries: 5
       restart: always
     airflow-init:
       <<: *airflow-common
       container_name: airflow.init
       command: db init
       environment:
         <<: *airflow-common-env
     flower:
       <<: *airflow-common
       container_name: airflow.flower
       command: celery flower
       ports:
         - 5555:5555
       healthcheck:
         test: ["CMD", "curl", "--fail", "http://localhost:5555/"]
         interval: 10s
         timeout: 10s
         retries: 5
       restart: always
   ```
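   
   The stack is started by running the one-shot init service first and then the long-running services, roughly:
   
   ```bash
   docker-compose up airflow-init      # one-shot "airflow db init"
   docker-compose up -d airflow-webserver airflow-scheduler1 airflow-worker flower
   docker-compose ps                   # check that the healthchecks pass
   ```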
   
   ### What happened
   
   When I run my DAGs, some tasks get stuck in the `queued` state and never start running.
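   
   To illustrate, this is how I check the stuck tasks (a sketch; container names come from the compose file above):
   
   ```bash
   # look for the stuck tasks in the scheduler log
   docker logs airflow.scheduler_1 2>&1 | grep -i queued | tail
   
   # check whether the Celery worker has actually picked anything up
   docker exec airflow.worker \
     celery --app airflow.executors.celery_executor.app inspect active
   ```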
   
   ### What you expected to happen
   
   Airflow is deployed in 3 independent environments, and only 1 of them shows this behavior.
   
   I tried clearing the data and rebuilding PostgreSQL, Redis, and Airflow, and I upgraded Airflow from 2.2.0 to 2.2.1.
   Finally I rebuilt everything in a fresh environment, but the problem still exists.
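   
   The reset between attempts was along these lines (a sketch; the image's entrypoint is `airflow`, so the command override below runs `airflow db reset -y`):
   
   ```bash
   docker-compose down
   # wipe the metadata DB, then bring everything back up
   docker-compose run --rm airflow-init db reset -y
   docker-compose up -d
   ```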
   
   
   ### How to reproduce
   
   _No response_
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

