laserpedro edited a comment on issue #18041:
URL: https://github.com/apache/airflow/issues/18041#issuecomment-914227769
Hello,
I am facing the same issue:
airflow 2.1.3 (tested also with 2.1.2, 2.1.1)
backend: postgresql
executor: LocalExecutor
unixname: airflow
task default user (run_as_user) = airflow
I have set the configuration options `killed_task_cleanup_time` to 100000 and
`schedule_after_task_execution` to False.
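For reference, here is how those two changes look in my `airflow.cfg` (section names as I understand them from the Airflow 2.1 configuration reference; the environment-variable forms in the comments are the standard `AIRFLOW__SECTION__KEY` equivalents):

```ini
# airflow.cfg (excerpt)
[core]
# Seconds to wait before cleaning up the process of a killed task
# (env equivalent: AIRFLOW__CORE__KILLED_TASK_CLEANUP_TIME)
killed_task_cleanup_time = 100000

[scheduler]
# Disable the "mini scheduler" run after each task finishes
# (env equivalent: AIRFLOW__SCHEDULER__SCHEDULE_AFTER_TASK_EXECUTION)
schedule_after_task_execution = False
```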
I have also installed Airflow as a non-root user and set the default
`run_as_user` to `airflow`.
I have also tried clearing the task instances and the DAG runs to start from
scratch.
My tasks are constantly getting killed in backfill mode with this message in
the logs:
```
[2021-09-07 10:21:20,185] {process_utils.py:100} INFO - Sending Signals.SIGTERM to GPID 73011
```
Honestly, I am a bit discouraged at this point. Could you help me, please?
Thanks.
Task logs:
```
Reading local file:
/home/airflow/airflow/logs/import_forex_unwind_trades/detect_book_referential/2021-08-05T00:00:00+00:00/28.log
[2021-09-07 11:18:19,826] {taskinstance.py:896} INFO - Dependencies all met
for <TaskInstance: import_forex_unwind_trades.detect_book_referential
2021-08-05T00:00:00+00:00 [queued]>
[2021-09-07 11:18:21,644] {taskinstance.py:896} INFO - Dependencies all met
for <TaskInstance: import_forex_unwind_trades.detect_book_referential
2021-08-05T00:00:00+00:00 [queued]>
[2021-09-07 11:18:21,644] {taskinstance.py:1087} INFO -
--------------------------------------------------------------------------------
[2021-09-07 11:18:21,644] {taskinstance.py:1088} INFO - Starting attempt 28
of 28
[2021-09-07 11:18:21,644] {taskinstance.py:1089} INFO -
--------------------------------------------------------------------------------
[2021-09-07 11:18:21,661] {taskinstance.py:1107} INFO - Executing
<Task(HttpSensor): detect_book_referential> on 2021-08-05T00:00:00+00:00
[2021-09-07 11:18:21,662] {base_task_runner.py:133} INFO - Running on host:
gvasrv-airflow
[2021-09-07 11:18:21,662] {base_task_runner.py:134} INFO - Running:
['airflow', 'tasks', 'run', 'import_forex_unwind_trades',
'detect_book_referential', '2021-08-05T00:00:00+00:00', '--job-id', '723490',
'--pool', 'default_pool', '--raw', '--subdir',
'DAGS_FOLDER/import_fx_unwind_trades.py', '--cfg-path', '/tmp/tmpe43x01zl',
'--error-file', '/tmp/tmpohmu9qb7']
[2021-09-07 11:18:24,493] {base_task_runner.py:118} INFO - Job 723490:
Subtask detect_book_referential [2021-09-07 11:18:24,492] {dagbag.py:496} INFO
- Filling up the DagBag from
/home/airflow/airflow/dags/import_fx_unwind_trades.py
[2021-09-07 11:18:26,045] {base_task_runner.py:118} INFO - Job 723490:
Subtask detect_book_referential
/home/airflow/anaconda3/envs/airflow_dev_py37/lib/python3.7/site-packages/airflow/providers/http/sensors/http.py:26
DeprecationWarning: This decorator is deprecated.
[2021-09-07 11:18:26,045] {base_task_runner.py:118} INFO - Job 723490:
Subtask detect_book_referential
[2021-09-07 11:18:26,045] {base_task_runner.py:118} INFO - Job 723490:
Subtask detect_book_referential In previous versions, all subclasses of
BaseOperator must use apply_default decorator for the`default_args` feature to
work properly.
[2021-09-07 11:18:26,045] {base_task_runner.py:118} INFO - Job 723490:
Subtask detect_book_referential
[2021-09-07 11:18:26,045] {base_task_runner.py:118} INFO - Job 723490:
Subtask detect_book_referential In current version, it is optional. The
decorator is applied automatically using the metaclass.
[2021-09-07 11:18:26,045] {base_task_runner.py:118} INFO - Job 723490:
Subtask detect_book_referential
[2021-09-07 11:18:32,281] {base_task_runner.py:118} INFO - Job 723490:
Subtask detect_book_referential [2021-09-07 11:18:32,280] {base.py:78} INFO -
Using connection to: id: alpflow_symph_conn. Host:
https://as.symphony.com/integration/v1/whi/simpleWebHookIntegration/, Port:
None, Schema: None, Login: None, Password: None, extra: {'webhook_token': }
[2021-09-07 11:18:33,857] {local_task_job.py:194} WARNING - Recorded pid
60749 does not match the current pid 98677
[2021-09-07 11:18:33,863] {process_utils.py:100} INFO - Sending
Signals.SIGTERM to GPID 98677
[2021-09-07 11:18:33,870] {process_utils.py:66} INFO - Process
psutil.Process(pid=98677, status='terminated', exitcode=<Negsignal.SIGTERM:
-15>, started='11:18:21') (98677) terminated with exit code Negsignal.SIGTERM
```
Scheduler log:
```
[2021-09-07 11:33:13,489] {dagrun.py:429} ERROR - Marking run <DagRun
import_forex_unwind_trades @ 2021-08-11 00:00:00+00:00:
backfill__2021-08-11T00:00:00+00:00, externally triggered: False> failed
[2021-09-07 11:33:13,527] {backfill_job.py:388} INFO - [backfill progress] |
finished run 7 of 16 | tasks waiting: 0 | succeeded: 0 | running: 9 | failed:
39 | skipped: 0 | deadlocked: 0 | not ready: 0
[2021-09-07 11:33:14,819] {local_executor.py:124} ERROR - Failed to execute
task PID of job runner does not match.
```
This makes Airflow unusable for me, since it is highly unstable.
Thanks,
Pierre