yannibenoit opened a new issue, #27885:
URL: https://github.com/apache/airflow/issues/27885

   ### Apache Airflow version
   
   2.4.3
   
   ### What happened
   
   I recently upgraded Airflow to 2.4.3.
   
   Some DAGs are not working anymore, and I see the following error in the logs:
   
   ```
   [2022-11-17, 03:30:03 CET] {taskinstance.py:1389} INFO - Executing 
<Task(BashOperator): bash_create_and_init_wootric_venv> on 2022-11-16 
02:30:00+00:00
   [2022-11-17, 03:30:03 CET] {base_task_runner.py:141} INFO - Running on host: 
35409bc57fd4
   [2022-11-17, 03:30:03 CET] {base_task_runner.py:142} INFO - Running: 
['airflow', 'tasks', 'run', 'singer_wootric_stitch', 
'bash_create_and_init_wootric_venv', 'scheduled__2022-11-16T02:30:00+00:00', 
'--job-id', '4163', '--raw', '--subdir', 'DAGS_FOLDER/stitch-wootric.py', 
'--cfg-path', '/tmp/tmppngn04eh', '--error-file', '/tmp/tmpyn355ch4']
   [2022-11-17, 03:30:04 CET] {base_task_runner.py:127} INFO - Job 4163: 
Subtask bash_create_and_init_wootric_venv 
/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py:538: 
DeprecationWarning: The sql_alchemy_conn option in [core] has been moved to the 
sql_alchemy_conn option in [database] - the old setting has been used, but 
please update your config.
   [2022-11-17, 03:30:04 CET] {base_task_runner.py:127} INFO - Job 4163: 
Subtask bash_create_and_init_wootric_venv   option = 
self._get_environment_variables(deprecated_key, deprecated_section, key, 
section)
   [2022-11-17, 03:30:04 CET] {base_task_runner.py:127} INFO - Job 4163: 
Subtask bash_create_and_init_wootric_venv 
/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py:538 
DeprecationWarning: The remote_logging option in [core] has been moved to the 
remote_logging option in [logging] - the old setting has been used, but please 
update your config.
   [2022-11-17, 03:30:06 CET] {base_task_runner.py:127} INFO - Job 4163: Subtask bash_create_and_init_wootric_venv [2022-11-17, 03:30:06 CET] {dagbag.py:508} INFO - Filling up the DagBag from /opt/airflow/dags/stitch-wootric.py
   [2022-11-17, 03:30:07 CET] {base_task_runner.py:127} INFO - Job 4163: Subtask bash_create_and_init_wootric_venv [2022-11-17, 03:30:07 CET] {task_command.py:371} INFO - Running <TaskInstance: singer_wootric_stitch.bash_create_and_init_wootric_venv scheduled__2022-11-16T02:30:00+00:00 [running]> on host 35409bc57fd4
   [2022-11-17, 03:30:07 CET] {taskinstance.py:1583} INFO - Exporting the 
following env vars:
   AIRFLOW_CTX_DAG_OWNER=airflow
   AIRFLOW_CTX_DAG_ID=singer_wootric_stitch
   AIRFLOW_CTX_TASK_ID=bash_create_and_init_wootric_venv
   AIRFLOW_CTX_EXECUTION_DATE=2022-11-16T02:30:00+00:00
   AIRFLOW_CTX_TRY_NUMBER=1
   AIRFLOW_CTX_DAG_RUN_ID=scheduled__2022-11-16T02:30:00+00:00
   [2022-11-17, 03:30:07 CET] {subprocess.py:62} INFO - Tmp dir root location: 
    /tmp
   [2022-11-17, 03:30:07 CET] {subprocess.py:74} INFO - Running command: 
['/bin/bash', '-c', '\n    sudo python3 -m venv /opt/wootric && sudo 
/opt/wootric/bin/pip install -e /opt/airflow/connectors/singer/tap-wootric/\n   
 ']
   [2022-11-17, 03:30:07 CET] {subprocess.py:85} INFO - Output:
   [2022-11-17, 03:30:08 CET] {local_task_job.py:215} WARNING - Recorded pid 
11020 does not match the current pid 11022
   [2022-11-17, 03:30:08 CET] {process_utils.py:129} INFO - Sending 
Signals.SIGTERM to group 11022. PIDs of all processes in the group: [11038, 
11039, 11040, 11022]
   [2022-11-17, 03:30:08 CET] {process_utils.py:80} INFO - Sending the signal 
Signals.SIGTERM to group 11022
   [2022-11-17, 03:30:08 CET] {taskinstance.py:1553} ERROR - Received SIGTERM. 
Terminating subprocesses.
   [2022-11-17, 03:30:08 CET] {subprocess.py:103} INFO - Sending SIGTERM signal 
to process group
   [2022-11-17, 03:30:08 CET] {taskinstance.py:1902} ERROR - Task failed with 
exception
   Traceback (most recent call last):
     File 
"/home/airflow/.local/lib/python3.7/site-packages/airflow/operators/bash.py", 
line 191, in execute
       cwd=self.cwd,
     File 
"/home/airflow/.local/lib/python3.7/site-packages/airflow/hooks/subprocess.py", 
line 90, in run_command
       for raw_line in iter(self.sub_process.stdout.readline, b''):
     File 
"/home/airflow/.local/lib/python3.7/site-packages/airflow/models/taskinstance.py",
 line 1555, in signal_handler
       raise AirflowException("Task received SIGTERM signal")
   airflow.exceptions.AirflowException: Task received SIGTERM signal
   [2022-11-17, 03:30:08 CET] {taskinstance.py:1412} INFO - Marking task as 
FAILED. dag_id=singer_wootric_stitch, 
task_id=bash_create_and_init_wootric_venv, execution_date=20221116T023000, 
start_date=20221117T023003, end_date=20221117T023008
   [2022-11-17, 03:30:08 CET] {base_task_runner.py:127} INFO - Job 4163: 
Subtask bash_create_and_init_wootric_venv Traceback (most recent call last):
   [2022-11-17, 03:30:08 CET] {base_task_runner.py:127} INFO - Job 4163: 
Subtask bash_create_and_init_wootric_venv   File 
"/home/airflow/.local/bin/airflow", line 8, in <module>
   [2022-11-17, 03:30:08 CET] {base_task_runner.py:127} INFO - Job 4163: 
Subtask bash_create_and_init_wootric_venv     sys.exit(main())
   [2022-11-17, 03:30:08 CET] {base_task_runner.py:127} INFO - Job 4163: 
Subtask bash_create_and_init_wootric_venv   File 
"/home/airflow/.local/lib/python3.7/site-packages/airflow/__main__.py", line 
38, in main
   [2022-11-17, 03:30:08 CET] {base_task_runner.py:127} INFO - Job 4163: 
Subtask bash_create_and_init_wootric_venv     args.func(args)
   [2022-11-17, 03:30:08 CET] {base_task_runner.py:127} INFO - Job 4163: 
Subtask bash_create_and_init_wootric_venv   File 
"/home/airflow/.local/lib/python3.7/site-packages/airflow/cli/cli_parser.py", 
line 51, in command
   [2022-11-17, 03:30:08 CET] {base_task_runner.py:127} INFO - Job 4163: 
Subtask bash_create_and_init_wootric_venv     return func(*args, **kwargs)
   [2022-11-17, 03:30:08 CET] {base_task_runner.py:127} INFO - Job 4163: 
Subtask bash_create_and_init_wootric_venv   File 
"/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/cli.py", line 
99, in wrapper
   [2022-11-17, 03:30:08 CET] {base_task_runner.py:127} INFO - Job 4163: 
Subtask bash_create_and_init_wootric_venv     return f(*args, **kwargs)
   [2022-11-17, 03:30:08 CET] {base_task_runner.py:127} INFO - Job 4163: 
Subtask bash_create_and_init_wootric_venv   File 
"/home/airflow/.local/lib/python3.7/site-packages/airflow/cli/commands/task_command.py",
 line 377, in task_run
   [2022-11-17, 03:30:08 CET] {base_task_runner.py:127} INFO - Job 4163: 
Subtask bash_create_and_init_wootric_venv     
_run_task_by_selected_method(args, dag, ti)
   [2022-11-17, 03:30:08 CET] {base_task_runner.py:127} INFO - Job 4163: 
Subtask bash_create_and_init_wootric_venv   File 
"/home/airflow/.local/lib/python3.7/site-packages/airflow/cli/commands/task_command.py",
 line 185, in _run_task_by_selected_method
   [2022-11-17, 03:30:08 CET] {base_task_runner.py:127} INFO - Job 4163: 
Subtask bash_create_and_init_wootric_venv     _run_raw_task(args, ti)
   [2022-11-17, 03:30:08 CET] {base_task_runner.py:127} INFO - Job 4163: 
Subtask bash_create_and_init_wootric_venv   File 
"/home/airflow/.local/lib/python3.7/site-packages/airflow/cli/commands/task_command.py",
 line 262, in _run_raw_task
   [2022-11-17, 03:30:08 CET] {base_task_runner.py:127} INFO - Job 4163: 
Subtask bash_create_and_init_wootric_venv     error_file=args.error_file,
   [2022-11-17, 03:30:08 CET] {base_task_runner.py:127} INFO - Job 4163: 
Subtask bash_create_and_init_wootric_venv   File 
"/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/session.py", 
line 71, in wrapper
   [2022-11-17, 03:30:08 CET] {base_task_runner.py:127} INFO - Job 4163: 
Subtask bash_create_and_init_wootric_venv     return func(*args, 
session=session, **kwargs)
   [2022-11-17, 03:30:08 CET] {base_task_runner.py:127} INFO - Job 4163: 
Subtask bash_create_and_init_wootric_venv   File 
"/home/airflow/.local/lib/python3.7/site-packages/airflow/models/taskinstance.py",
 line 1463, in _run_raw_task
   [2022-11-17, 03:30:08 CET] {base_task_runner.py:127} INFO - Job 4163: 
Subtask bash_create_and_init_wootric_venv     
self._execute_task_with_callbacks(context, test_mode)
   [2022-11-17, 03:30:08 CET] {base_task_runner.py:127} INFO - Job 4163: 
Subtask bash_create_and_init_wootric_venv   File 
"/home/airflow/.local/lib/python3.7/site-packages/airflow/models/taskinstance.py",
 line 1610, in _execute_task_with_callbacks
   [2022-11-17, 03:30:08 CET] {base_task_runner.py:127} INFO - Job 4163: 
Subtask bash_create_and_init_wootric_venv     result = 
self._execute_task(context, task_orig)
   [2022-11-17, 03:30:08 CET] {base_task_runner.py:127} INFO - Job 4163: 
Subtask bash_create_and_init_wootric_venv   File 
"/home/airflow/.local/lib/python3.7/site-packages/airflow/models/taskinstance.py",
 line 1671, in _execute_task
   [2022-11-17, 03:30:08 CET] {base_task_runner.py:127} INFO - Job 4163: 
Subtask bash_create_and_init_wootric_venv     result = 
execute_callable(context=context)
   [2022-11-17, 03:30:08 CET] {base_task_runner.py:127} INFO - Job 4163: 
Subtask bash_create_and_init_wootric_venv   File 
"/home/airflow/.local/lib/python3.7/site-packages/airflow/operators/bash.py", 
line 191, in execute
   [2022-11-17, 03:30:08 CET] {base_task_runner.py:127} INFO - Job 4163: 
Subtask bash_create_and_init_wootric_venv     cwd=self.cwd,
   [2022-11-17, 03:30:08 CET] {base_task_runner.py:127} INFO - Job 4163: 
Subtask bash_create_and_init_wootric_venv   File 
"/home/airflow/.local/lib/python3.7/site-packages/airflow/hooks/subprocess.py", 
line 90, in run_command
   [2022-11-17, 03:30:08 CET] {base_task_runner.py:127} INFO - Job 4163: 
Subtask bash_create_and_init_wootric_venv     for raw_line in 
iter(self.sub_process.stdout.readline, b''):
   [2022-11-17, 03:30:08 CET] {base_task_runner.py:127} INFO - Job 4163: 
Subtask bash_create_and_init_wootric_venv   File 
"/home/airflow/.local/lib/python3.7/site-packages/airflow/models/taskinstance.py",
 line 1555, in signal_handler
   [2022-11-17, 03:30:08 CET] {base_task_runner.py:127} INFO - Job 4163: 
Subtask bash_create_and_init_wootric_venv     raise AirflowException("Task 
received SIGTERM signal")
   [2022-11-17, 03:30:08 CET] {base_task_runner.py:127} INFO - Job 4163: 
Subtask bash_create_and_init_wootric_venv airflow.exceptions.AirflowException: 
Task received SIGTERM signal
   [2022-11-17, 03:30:08 CET] {process_utils.py:75} INFO - Process 
psutil.Process(pid=11038, status='terminated', started='02:30:07') (11038) 
terminated with exit code None
   [2022-11-17, 03:30:08 CET] {process_utils.py:75} INFO - Process 
psutil.Process(pid=11039, status='terminated', started='02:30:07') (11039) 
terminated with exit code None
   [2022-11-17, 03:30:08 CET] {process_utils.py:75} INFO - Process 
psutil.Process(pid=11040, status='terminated', started='02:30:07') (11040) 
terminated with exit code None
   [2022-11-17, 03:30:09 CET] {process_utils.py:75} INFO - Process 
psutil.Process(pid=11022, status='terminated', exitcode=1, started='02:30:02') 
(11022) terminated with exit code 1
   ```
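   
   The key lines above are the `local_task_job.py:215` warning ("Recorded pid 11020 does not match the current pid 11022") followed immediately by the SIGTERM sent to the task's process group: the supervising job kills the task right after the bash subprocess starts. For context, the failing task boils down to the following (a minimal sketch reconstructed from the command in the log; the daily 02:30 schedule and the start date are assumptions inferred from the run id):
   
   ```python
   # Minimal sketch of the failing DAG, reconstructed from the task log above.
   # The schedule and start_date are assumptions inferred from the run id
   # scheduled__2022-11-16T02:30:00+00:00; the bash command is as logged.
   import pendulum
   
   from airflow import DAG
   from airflow.operators.bash import BashOperator
   
   with DAG(
       dag_id="singer_wootric_stitch",
       schedule="30 2 * * *",
       start_date=pendulum.datetime(2022, 11, 1, tz="UTC"),
       catchup=False,
   ) as dag:
       BashOperator(
           task_id="bash_create_and_init_wootric_venv",
           bash_command=(
               "sudo python3 -m venv /opt/wootric && "
               "sudo /opt/wootric/bin/pip install -e /opt/airflow/connectors/singer/tap-wootric/"
           ),
       )
   ```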
   
   ### What you think should happen instead
   
   The DAG should run successfully, without the task receiving a SIGTERM signal.
   
   ### How to reproduce
   
   No response
   
   ### Operating System
   
   NAME="Ubuntu" VERSION="18.04.5 LTS (Bionic Beaver)"
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow-providers-amazon==6.0.0
   apache-airflow-providers-common-sql==1.2.0
   apache-airflow-providers-docker==3.3.0
   apache-airflow-providers-ftp==3.1.0
   apache-airflow-providers-google==8.4.0
   apache-airflow-providers-http==4.0.0
   apache-airflow-providers-imap==3.0.0
   apache-airflow-providers-mongo==3.0.0
   apache-airflow-providers-postgres==5.2.2
   apache-airflow-providers-slack==6.0.0
   apache-airflow-providers-sqlite==3.2.1
   
   ### Deployment
   
   Docker-Compose
   
   ### Deployment details
   
   Attached is the docker-compose template used by Ansible to deploy:
   
   ```yaml
   ---
   version: '3'
   x-airflow-common:
     environment:
       &airflow-common-env
       AIRFLOW__CORE__EXECUTOR: CeleryExecutor
       AIRFLOW__CORE__SQL_ALCHEMY_CONN: "postgresql+psycopg2://{{ cloud_sql_username }}:{{ cloud_sql_password }}@{{ cloud_sql_host }}/airflow"
       AIRFLOW__CELERY__RESULT_BACKEND: "db+postgresql://{{ cloud_sql_username }}:{{ cloud_sql_password }}@{{ cloud_sql_host }}/airflow"
       AIRFLOW__CELERY__BROKER_URL: redis://:@redis:6379/0
       AIRFLOW__CORE__FERNET_KEY: ''
       AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION: 'true'
       AIRFLOW__CORE__LOAD_EXAMPLES: 'false'
       AIRFLOW__API__AUTH_BACKEND: 'airflow.api.auth.backend.basic_auth'
       AIRFLOW_BASE_URL: "https://{{ server_name }}"
       AIRFLOW__WEBSERVER__ENABLE_PROXY_FIX: 'true'
       AIRFLOW_HOME: "{{ airflow_home }}"
       POSTGRES_USER: "{{ cloud_sql_username }}"
       POSTGRES_PASSWORD: "{{ cloud_sql_password }}"
       POSTGRES_DB: airflow
       WOOTRIC_USERNAME: "{{ wootric_username }}"
       WOOTRIC_PASSWORD: "{{ wootric_password }}"
       WOOTRIC_STITCH_TOKEN: "{{ wootric_stitch_token }}"
       FTP_USER: "{{ ftp_user }}"
       FTP_PASSWORD: "{{ ftp_password }}"
       FTP_URL: "{{ ftp_url }}"
       FTP_PORT: "{{ ftp_port }}"
       BQ_ENV: "{{ bq_env }}"
       GCS_BUCKET: "{{ gcs_bucket }}"
       AIRFLOW__CORE__REMOTE_LOGGING: 'false'
       AIRFLOW__DATABASE__LOAD_DEFAULT_CONNECTIONS: 'false'
       AIRFLOW__CELERY__WORKER_CONCURRENCY: '2'
       AIRFLOW__CELERY__WORKER_AUTOSCALE: '1,2'
       AIRFLOW__CORE__MAX_ACTIVE_RUNS_PER_DAG: '1'
       AIRFLOW__SCHEDULER__SCHEDULE_AFTER_TASK_EXECUTION: 'False'
       AIRFLOW__SCHEDULER__DAG_DIR_LIST_INTERVAL: "30"
       AIRFLOW__CORE__KILLED_TASK_CLEANUP_TIME: "604800"
       GOOGLE_APPLICATION_CREDENTIALS: "{{ airflow_home }}/dags/airflow.json"
       
     volumes:
       - "{{ airflow_home }}/connectors:{{ airflow_home }}/connectors:z"
       - "{{ airflow_home }}/dags:{{ airflow_home }}/dags:z"
       - "{{ airflow_home }}/logs:{{ airflow_home }}/logs:z"
       - "{{ airflow_home }}/plugins:{{ airflow_home }}/plugins:z"
   
     user: "${AIRFLOW_UID:-50000}:${AIRFLOW_GID:-50000}"
     depends_on:
       redis:
         condition: service_healthy
   
   services:
     redis:
       image: redis:latest
       ports:
         - 6379:6379
       networks:
         - airflow
       healthcheck:
         test: ["CMD", "redis-cli", "ping"]
         interval: 5s
         timeout: 30s
         retries: 50
       restart: always
   
     airflow-webserver:
       image:  yannibenoitiyeze/airflow:2.4.2
       container_name: airflow-webserver
       command: webserver
       networks:
         - traefik_default
         - airflow
       environment:
         <<: *airflow-common-env
       volumes:
           - "{{ airflow_home }}/connectors:{{ airflow_home }}/connectors:z"
           - "{{ airflow_home }}/dags:{{ airflow_home }}/dags:z"
           - "{{ airflow_home }}/logs:{{ airflow_home }}/logs:z"
           - "{{ airflow_home }}/plugins:{{ airflow_home }}/plugins:z"
       ports:
         - 8080:8080
       labels:
         - "traefik.enable=true"
         - "traefik.http.routers.whoami.rule=Host(`{{ server_name }}`)"
         - "traefik.docker.network=traefik_default"
         - "traefik.http.routers.whoami.entrypoints=websecure"
         - "traefik.http.routers.whoami.tls.certresolver=myresolver"
       healthcheck:
         test: ["CMD", "curl", "--fail", "http://localhost:8080/health"]
         interval: 10s
         timeout: 10s
         retries: 5
       restart: always
   
     airflow-scheduler:
       image:  yannibenoitiyeze/airflow:2.4.2
       command: scheduler
       networks:
         - airflow
       environment:
         <<: *airflow-common-env
       volumes:
         - "{{ airflow_home }}/connectors:{{ airflow_home }}/connectors:z"
         - "{{ airflow_home }}/dags:{{ airflow_home }}/dags:z"
         - "{{ airflow_home }}/logs:{{ airflow_home }}/logs:z"
         - "{{ airflow_home }}/plugins:{{ airflow_home }}/plugins:z"
       healthcheck:
         test: ["CMD-SHELL", 'airflow jobs check --job-type SchedulerJob --hostname "$${HOSTNAME}"']
         interval: 10s
         timeout: 10s
         retries: 5
       restart: always
   
     airflow-worker:
       image:  yannibenoitiyeze/airflow:2.4.2
       command: celery worker
       networks:
         - airflow
       volumes:
         - "{{ airflow_home }}/connectors:{{ airflow_home }}/connectors:z"
         - "{{ airflow_home }}/dags:{{ airflow_home }}/dags:z"
         - "{{ airflow_home }}/logs:{{ airflow_home }}/logs:z"
         - "{{ airflow_home }}/plugins:{{ airflow_home }}/plugins:z"
       environment:
         <<: *airflow-common-env
   
       healthcheck:
         test:
           - "CMD-SHELL"
           - 'celery --app airflow.executors.celery_executor.app inspect ping -d "celery@$${HOSTNAME}"'
         interval: 10s
         timeout: 10s
         retries: 5
       restart: always
   
     airflow-init:
       image:  yannibenoitiyeze/airflow:2.4.2
       command: version
       networks:
         - airflow
       volumes:
         - "{{ airflow_home }}/connectors:{{ airflow_home }}/connectors:z"
         - "{{ airflow_home }}/dags:{{ airflow_home }}/dags:z"
         - "{{ airflow_home }}/logs:{{ airflow_home }}/logs:z"
         - "{{ airflow_home }}/plugins:{{ airflow_home }}/plugins:z"
   
       environment:
         <<: *airflow-common-env
         _AIRFLOW_DB_UPGRADE: 'true'
         _AIRFLOW_WWW_USER_CREATE: 'true'
         _AIRFLOW_WWW_USER_USERNAME: "{{ airflow_username }}"
         _AIRFLOW_WWW_USER_PASSWORD: "{{ airflow_password }}"
   
     flower:
       image: yannibenoitiyeze/airflow:2.4.2
       command: celery flower
       networks:
         - airflow
       healthcheck:
         test: ["CMD", "curl", "--fail", "http://localhost:5555/"]
         interval: 10s
         timeout: 10s
         retries: 5
       environment:
         <<: *airflow-common-env
       restart: always
   
   networks:
     traefik_default:
       external:
         name: traefik_default
     airflow:
       name: airflow
       driver: bridge
   ```
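   
   Note that the two DeprecationWarnings in the task log point at options set in this template under their old names. Renaming them is sketched below (values copied unchanged from the template above; only the config section changes, per the warnings; not verified against this deployment):
   
   ```yaml
   # Sketch only: [core] sql_alchemy_conn moved to [database] and
   # [core] remote_logging moved to [logging], per the DeprecationWarnings.
   AIRFLOW__DATABASE__SQL_ALCHEMY_CONN: "postgresql+psycopg2://{{ cloud_sql_username }}:{{ cloud_sql_password }}@{{ cloud_sql_host }}/airflow"
   AIRFLOW__LOGGING__REMOTE_LOGGING: 'false'
   ```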
   
   ### Anything else
   
   No response
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

