trlopes1974 commented on issue #39717:
URL: https://github.com/apache/airflow/issues/39717#issuecomment-2213359801
Detailed log sequence for the failing task.
Note that the task 'dispatch_restores' was scheduled/queued at 2024-07-07
13:12:03,993 and marked as failed roughly ten minutes later, at 2024-07-07
13:22:59,976.
I cannot find the task id '9a60d4ed-09b2-4e83-9b1d-58c47839a068' in
Flower/Celery, and I have no other logs to search.
Any ideas?
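
Since Flower may only retain recent tasks, one way to check whether the result
backend ever saw that id is to ask Celery directly. A minimal sketch, assuming
the Celery executor and that it runs with the same Airflow config/environment
as the workers (the import path below is the newer Celery-provider one; older
installs expose `app` from `airflow.executors.celery_executor`):

```python
# Minimal sketch: look up the external_id the scheduler assigned directly
# in the Celery result backend, bypassing Flower.
from celery.result import AsyncResult

# Newer installs (Celery provider); on older Airflow use:
#   from airflow.executors.celery_executor import app
from airflow.providers.celery.executors.celery_executor_utils import app

result = AsyncResult("9a60d4ed-09b2-4e83-9b1d-58c47839a068", app=app)
print(result.state, result.info)
```

Note that Celery reports PENDING for any id it has no record of, so PENDING
here would be consistent with the message never having reached a worker.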
**scheduler.stdout**
```
<TaskInstance: CSDISPATCHER_OTHERS.dispatch_restores
scheduled__2024-07-07T12:07:00+00:00 [scheduled]>
<TaskInstance: CSDISPATCHER_OTHERS.dispatch_correcao_dados
scheduled__2024-07-07T12:07:00+00:00 [scheduled]>
<TaskInstance: CSDISPATCHER_OTHERS.dispatch_restores
scheduled__2024-07-07T12:07:00+00:00 [scheduled]>
<TaskInstance: CSDISPATCHER_OTHERS.dispatch_correcao_dados
scheduled__2024-07-07T12:07:00+00:00 [scheduled]>
2024-07-07 13:12:03,851 INFO - Sending
TaskInstanceKey(dag_id='CSDISPATCHER_OTHERS', task_id='dispatch_restores',
run_id='scheduled__2024-07-07T12:07:00+00:00', try_number=1, map_index=-1) to
executor with priority 2 and queue default
2024-07-07 13:12:03,851 INFO - Adding to queue: ['airflow', 'tasks', 'run',
'CSDISPATCHER_OTHERS', 'dispatch_restores',
'scheduled__2024-07-07T12:07:00+00:00', '--local', '--subdir',
'DAGS_FOLDER/CSDISPATCHER_OTHERS.py']
2024-07-07 13:12:03,852 INFO - Sending
TaskInstanceKey(dag_id='CSDISPATCHER_OTHERS',
task_id='dispatch_correcao_dados',
run_id='scheduled__2024-07-07T12:07:00+00:00', try_number=1, map_index=-1) to
executor with priority 2 and queue default
2024-07-07 13:12:03,852 INFO - Adding to queue: ['airflow', 'tasks', 'run',
'CSDISPATCHER_OTHERS', 'dispatch_correcao_dados',
'scheduled__2024-07-07T12:07:00+00:00', '--local', '--subdir',
'DAGS_FOLDER/CSDISPATCHER_OTHERS.py']
2024-07-07 13:12:03,984 INFO - Received executor event with state queued for
task instance TaskInstanceKey(dag_id='CSDISPATCHER_OTHERS',
task_id='dispatch_restores', run_id='scheduled__2024-07-07T12:07:00+00:00',
try_number=1, map_index=-1)
2024-07-07 13:12:03,985 INFO - Received executor event with state queued for
task instance TaskInstanceKey(dag_id='CSDISPATCHER_OTHERS',
task_id='dispatch_correcao_dados',
run_id='scheduled__2024-07-07T12:07:00+00:00', try_number=1, map_index=-1)
2024-07-07 13:12:03,985 INFO - Received executor event with state success
for task instance TaskInstanceKey(dag_id='CSDISPATCHER_OTHERS',
task_id='Start', run_id='scheduled__2024-07-07T12:07:00+00:00', try_number=1,
map_index=-1)
2024-07-07 13:12:03,993 INFO - Setting external_id for <TaskInstance:
CSDISPATCHER_OTHERS.dispatch_restores scheduled__2024-07-07T12:07:00+00:00
[queued]> to 9a60d4ed-09b2-4e83-9b1d-58c47839a068
2024-07-07 13:12:03,993 INFO - Setting external_id for <TaskInstance:
CSDISPATCHER_OTHERS.dispatch_correcao_dados
scheduled__2024-07-07T12:07:00+00:00 [queued]> to
2418fdad-ed1e-440d-afb5-1c79b52e1390
2024-07-07 13:12:03,994 INFO - TaskInstance Finished:
dag_id=CSDISPATCHER_OTHERS, task_id=Start,
run_id=scheduled__2024-07-07T12:07:00+00:00, map_index=-1,
run_start_date=2024-07-07 12:12:03.327772+00:00, run_end_date=2024-07-07
12:12:03.703420+00:00, run_duration=0.375648, state=success,
executor_state=success, try_number=1, max_tries=0, job_id=411092,
pool=default_pool, queue=default, priority_weight=4, operator=EmptyOperator,
queued_dttm=2024-07-07 12:12:00.189483+00:00, queued_by_job_id=408194,
pid=2687251
2024-07-07 13:12:04,038 DEBUG - <TaskInstance: CSDISPATCHER_OTHERS.End
scheduled__2024-07-07T12:07:00+00:00 [None]> dependency 'Trigger Rule' PASSED:
False, Task's trigger rule 'all_success' requires all upstream tasks to have
succeeded, but found 2 non-success(es).
upstream_states=_UpstreamTIStates(success=0, skipped=0, failed=0,
upstream_failed=0, removed=0, done=0, success_setup=0, skipped_setup=0),
upstream_task_ids={'dispatch_correcao_dados', 'dispatch_restores'}
2024-07-07 13:12:04,039 DEBUG - Dependencies not met for <TaskInstance:
CSDISPATCHER_OTHERS.End scheduled__2024-07-07T12:07:00+00:00 [None]>,
dependency 'Trigger Rule' FAILED: Task's trigger rule 'all_success' requires
all upstream tasks to have succeeded, but found 2 non-success(es).
upstream_states=_UpstreamTIStates(success=0, skipped=0, failed=0,
upstream_failed=0, removed=0, done=0, success_setup=0, skipped_setup=0),
upstream_task_ids={'dispatch_correcao_dados', 'dispatch_restores'}
...
2024-07-07 13:22:53,014 DEBUG - <TaskInstance: CSDISPATCHER_OTHERS.End
scheduled__2024-07-07T12:07:00+00:00 [None]> dependency 'Previous Dagrun State'
PASSED: True, The task did not have depends_on_past set.
2024-07-07 13:22:53,014 DEBUG - <TaskInstance: CSDISPATCHER_OTHERS.End
scheduled__2024-07-07T12:07:00+00:00 [None]> dependency 'Not In Retry Period'
PASSED: True, The task instance was not marked for retrying.
2024-07-07 13:22:53,016 DEBUG - <TaskInstance:
CSDISPATCHER_OTHERS.dispatch_restores scheduled__2024-07-07T12:07:00+00:00
[queued]> dependency 'Previous Dagrun State' PASSED: True, The task did not
have depends_on_past set.
2024-07-07 13:22:53,016 DEBUG - <TaskInstance:
CSDISPATCHER_OTHERS.dispatch_restores scheduled__2024-07-07T12:07:00+00:00
[queued]> dependency 'Not In Retry Period' PASSED: True, The context specified
that being in a retry period was permitted.
...
2024-07-07 13:22:59,976 ERROR - Executor reports task instance
<TaskInstance: CSDISPATCHER_OTHERS.dispatch_restores
scheduled__2024-07-07T12:07:00+00:00 [queued]> finished (failed) although the
task says it's queued. (Info: None) Was the task killed externally?
```
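
The ~10-minute window between queued (13:12:03) and this failure (13:22:59) is
suspiciously close to the default queued-task timeout, i.e. the scheduler may
simply be giving up on a task that no worker ever picked up. A hedged check,
assuming Airflow 2.6+ (earlier versions used `celery.stalled_task_timeout` /
`celery.task_adoption_timeout` instead):

```python
# Hedged check, assuming Airflow 2.6+: tasks left in "queued" longer than
# scheduler.task_queued_timeout (default 600 seconds) are failed, which can
# produce this "finished (failed) although the task says it's queued"
# message when no worker ever ran them.
from airflow.configuration import conf

print(conf.getfloat("scheduler", "task_queued_timeout"))  # default 600.0
```

If that is the mechanism, the real question becomes why the Celery worker
never consumed the message from the `default` queue within those ten minutes.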
**dag processor manager log:**
```
[2024-07-07T13:22:59.985+0100] {manager.py:564} DEBUG - Received
{'full_filepath': '/home/ttauto/airflow/dags/CSDISPATCHER_OTHERS.py',
'processor_subdir': '/home/ttauto/airflow/dags', 'msg': "Executor reports task
instance <TaskInstance: CSDISPATCHER_OTHERS.dispatch_restores
scheduled__2024-07-07T12:07:00+00:00 [queued]> finished (failed) although the
task says it's queued. (Info: None) Was the task killed externally?",
'simple_task_instance': <airflow.models.taskinstance.SimpleTaskInstance object
at 0x7f13fd3bc550>, 'is_failure_callback': True} signal from
DagFileProcessorAgent
[2024-07-07T13:22:59.986+0100] {manager.py:719} DEBUG - Queuing
TaskCallbackRequest CallbackRequest: {'full_filepath':
'/home/ttauto/airflow/dags/CSDISPATCHER_OTHERS.py', 'processor_subdir':
'/home/ttauto/airflow/dags', 'msg': "Executor reports task instance
<TaskInstance: CSDISPATCHER_OTHERS.dispatch_restores
scheduled__2024-07-07T12:07:00+00:00 [queued]> finished (failed) although the
task says it's queued. (Info: None) Was the task killed externally?",
'simple_task_instance': <airflow.models.taskinstance.SimpleTaskInstance object
at 0x7f13fd3bc550>, 'is_failure_callback': True}
```

**scheduler logs:**
```
[2024-07-07T13:23:00.026+0100] {processor.py:161} INFO - Started process
(PID=2692915) to work on /home/ttauto/airflow/dags/CSDISPATCHER_OTHERS.py
[2024-07-07T13:23:00.027+0100] {processor.py:830} INFO - Processing file
/home/ttauto/airflow/dags/CSDISPATCHER_OTHERS.py for tasks to queue
[2024-07-07T13:23:00.028+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.028+0100] {dagbag.py:545} INFO - Filling up the DagBag
from /home/ttauto/airflow/dags/CSDISPATCHER_OTHERS.py
[2024-07-07T13:23:00.030+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.030+0100] {dagbag.py:337} DEBUG - Importing
/home/ttauto/airflow/dags/CSDISPATCHER_OTHERS.py
[2024-07-07T13:23:00.033+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.032+0100] {h_secrets_backend.py:208} DEBUG - _get_secret:
airflow_prod , variables/CSDISPATCHER_OTHERS_SCHEDULE
[2024-07-07T13:23:00.033+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.033+0100] {h_secrets_backend.py:117} DEBUG - init backend
with url:https://vault.kyndryl.net proxy:http://129.41.90.15:443
auth_type:approle
[2024-07-07T13:23:00.035+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.035+0100] {cachevaultmemcache.py:57} DEBUG - CA :
/opt/tkapp/ssl/slautop02_knb_combined.pem Host: memcache.slautop02.knb
[2024-07-07T13:23:00.704+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.703+0100] {h_secrets_backend.py:219} DEBUG - Got response
data (len): 1
[2024-07-07T13:23:00.704+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.704+0100] {h_secrets_backend.py:185} DEBUG -
get_variableCSDISPATCHER_OTHERS_SCHEDULE:{'value':
'2,7,12,17,22,27,32,37,42,47,52,57 * * * *'}
[2024-07-07T13:23:00.709+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.709+0100] {GetText.py:65} DEBUG -
/opt/tkapp/env_airflow/lib64/python3.9/site-packages/cron_descriptor/locale/en_US.mo
Loaded
[2024-07-07T13:23:00.728+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.728+0100] {dagbag.py:511} DEBUG - Loaded DAG <DAG:
CSDISPATCHER_OTHERS>
[2024-07-07T13:23:00.728+0100] {processor.py:840} INFO - DAG(s)
'CSDISPATCHER_OTHERS' retrieved from
/home/ttauto/airflow/dags/CSDISPATCHER_OTHERS.py
[2024-07-07T13:23:00.729+0100] {processor.py:706} DEBUG - Processing
Callback Request: {'full_filepath':
'/home/ttauto/airflow/dags/CSDISPATCHER_OTHERS.py', 'processor_subdir':
'/home/ttauto/airflow/dags', 'msg': "Executor reports task instance
<TaskInstance: CSDISPATCHER_OTHERS.dispatch_restores
scheduled__2024-07-07T12:07:00+00:00 [queued]> finished (failed) although the
task says it's queued. (Info: None) Was the task killed externally?",
'simple_task_instance': <airflow.models.taskinstance.SimpleTaskInstance object
at 0x7f13fd3bc550>, 'is_failure_callback': True}
[2024-07-07T13:23:00.759+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.759+0100] {taskinstance.py:2907} ERROR - Executor reports
task instance <TaskInstance: CSDISPATCHER_OTHERS.dispatch_restores
scheduled__2024-07-07T12:07:00+00:00 [queued]> finished (failed) although the
task says it's queued. (Info: None) Was the task killed externally?
[2024-07-07T13:23:00.767+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.767+0100] {taskinstance.py:562} DEBUG - Task Duration set
to None
[2024-07-07T13:23:00.769+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.769+0100] {taskinstance.py:584} DEBUG - Clearing
next_method and next_kwargs.
[2024-07-07T13:23:00.773+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.772+0100] {plugins_manager.py:332} DEBUG - Plugins are
already loaded. Skipping.
[2024-07-07T13:23:00.773+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.773+0100] {plugins_manager.py:516} DEBUG - Integrate DAG
plugins
[2024-07-07T13:23:00.789+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.788+0100] {taskinstance.py:1042} DEBUG -
previous_execution_date was called
[2024-07-07T13:23:00.800+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.800+0100] {taskinstance.py:1206} INFO - Marking task as
FAILED. dag_id=CSDISPATCHER_OTHERS, task_id=dispatch_restores,
run_id=scheduled__2024-07-07T12:07:00+00:00, execution_date=20240707T120700,
start_date=, end_date=20240707T122300
[2024-07-07T13:23:00.805+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.805+0100] {taskinstance.py:1042} DEBUG -
previous_execution_date was called
[2024-07-07T13:23:00.815+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.815+0100] {h_secrets_backend.py:208} DEBUG - _get_secret:
airflow_prod , connections/smtp_default
[2024-07-07T13:23:00.816+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.815+0100] {h_secrets_backend.py:219} DEBUG - Got response
data (len): 4
[2024-07-07T13:23:00.816+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.816+0100] {h_secrets_backend.py:258} DEBUG - getting
connection for smtp_default : {'conn_type': 'Email', 'extra': {'disable_ssl':
True, 'disable_tls': False, 'from_email': '[email protected]',
'retry_limit': 5, 'ssl_context': 'default', 'timeout': 20}, 'host':
'158.98.137.90', 'port': '25'}
[2024-07-07T13:23:00.817+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.816+0100] {crypto.py:82} WARNING - empty cryptography key
- values will not be stored encrypted.
[2024-07-07T13:23:00.818+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.818+0100] {base.py:84} INFO - Using connection ID
'smtp_default' for task execution.
[2024-07-07T13:23:00.820+0100] {logging_mixin.py:188} WARNING -
/opt/tkapp/env_airflow/lib64/python3.9/site-packages/airflow/utils/email.py:154
RemovedInAirflow3Warning: Fetching SMTP credentials from configuration
variables will be deprecated in a future release. Please set credentials using
a connection instead.
[2024-07-07T13:23:00.820+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.820+0100] {configuration.py:1050} WARNING - section/key
[smtp/smtp_user] not found in config
[2024-07-07T13:23:00.820+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.820+0100] {email.py:267} DEBUG - No user/password found
for SMTP, so logging in with no authentication.
[2024-07-07T13:23:00.821+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.821+0100] {email.py:271} INFO - Email alerting: attempt 1
[2024-07-07T13:23:00.824+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.824+0100] {email.py:282} INFO - Sent an alert email to
['{{ var.json.DAG_FAILURE_EMAIL }}']
[2024-07-07T13:23:00.850+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.850+0100] {h_secrets_backend.py:208} DEBUG - _get_secret:
airflow_prod , variables/RABBIT
[2024-07-07T13:23:00.851+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.851+0100] {h_secrets_backend.py:219} DEBUG - Got response
data (len): 9
[2024-07-07T13:23:00.851+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.851+0100] {h_secrets_backend.py:185} DEBUG -
get_variableRABBIT:{'rabbitExchange': 'amq.topic', 'rabbitHost':
'rabbit.slautop02.knb', 'rabbitHost_OLD': 'slautop02-rabbit01',
'rabbitPassword': 'WkNXQUvJVk!9c2X', 'rabbitPort': '5672', 'rabbitUserName':
'airflow', 'rabbitconnection_attempts': '3', 'rabbitsocket_timeout': '5000',
'rabbitvhost': 'airflowhost'}
[2024-07-07T13:23:00.853+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.852+0100] {h_secrets_backend.py:208} DEBUG - _get_secret:
airflow_prod , variables/TKAPP
[2024-07-07T13:23:00.853+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.853+0100] {h_secrets_backend.py:219} DEBUG - Got response
data (len): 10
[2024-07-07T13:23:00.853+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.853+0100] {h_secrets_backend.py:185} DEBUG -
get_variableTKAPP:{'backupdir': 'backups', 'db_purge_interval': '90', 'dbname':
'tkapp', 'host': 'postgres.slautop02.knb', 'host_OLD': 'slautop02-postgres01',
'options': '-c search_path=tkapp', 'password': '1qazXSW"3edcVFR$', 'port':
'5432', 'retention': '5', 'user': 'ttauto'}
[2024-07-07T13:23:00.899+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.899+0100] {CSDISPATCHER_OTHERS.py:63} DEBUG - dag:
CSDISPATCHER_OTHERS, task: End, state: failed, url:
https://airflow.slautop02.knb:8080/dags/CSDISPATCHER_OTHERS/grid?dag_run_id=scheduled__2024-07-07T12%3A07%3A00%2B00%3A00&task_id=End&map_index=-1&tab=logs,
Exception: Executor reports task instance <TaskInstance:
CSDISPATCHER_OTHERS.dispatch_restores scheduled__2024-07-07T12:07:00+00:00
[queued]> finished (failed) although the task says it's queued. (Info: None)
Was the task killed externally?
[2024-07-07T13:23:00.900+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.900+0100] {CSDISPATCHER_OTHERS.py:63} DEBUG - dag:
CSDISPATCHER_OTHERS, task: Start, state: failed, url:
https://airflow.slautop02.knb:8080/dags/CSDISPATCHER_OTHERS/grid?dag_run_id=scheduled__2024-07-07T12%3A07%3A00%2B00%3A00&task_id=Start&map_index=-1&tab=logs,
Exception: Executor reports task instance <TaskInstance:
CSDISPATCHER_OTHERS.dispatch_restores scheduled__2024-07-07T12:07:00+00:00
[queued]> finished (failed) although the task says it's queued. (Info: None)
Was the task killed externally?
[2024-07-07T13:23:00.900+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.900+0100] {CSDISPATCHER_OTHERS.py:63} DEBUG - dag:
CSDISPATCHER_OTHERS, task: dispatch_correcao_dados, state: failed, url:
https://airflow.slautop02.knb:8080/dags/CSDISPATCHER_OTHERS/grid?dag_run_id=scheduled__2024-07-07T12%3A07%3A00%2B00%3A00&task_id=dispatch_correcao_dados&map_index=-1&tab=logs,
Exception: Executor reports task instance <TaskInstance:
CSDISPATCHER_OTHERS.dispatch_restores scheduled__2024-07-07T12:07:00+00:00
[queued]> finished (failed) although the task says it's queued. (Info: None)
Was the task killed externally?
[2024-07-07T13:23:00.901+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.901+0100] {CSDISPATCHER_OTHERS.py:63} DEBUG - dag:
CSDISPATCHER_OTHERS, task: dispatch_restores, state: failed, url:
https://airflow.slautop02.knb:8080/dags/CSDISPATCHER_OTHERS/grid?dag_run_id=scheduled__2024-07-07T12%3A07%3A00%2B00%3A00&task_id=dispatch_restores&map_index=-1&tab=logs,
Exception: Executor reports task instance <TaskInstance:
CSDISPATCHER_OTHERS.dispatch_restores scheduled__2024-07-07T12:07:00+00:00
[queued]> finished (failed) although the task says it's queued. (Info: None)
Was the task killed externally?
[2024-07-07T13:23:00.901+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.901+0100] {CSDISPATCHER_OTHERS.py:69} ERROR - Finishing
error callback!
[2024-07-07T13:23:00.908+0100] {processor.py:791} INFO - Executed failure
callback for <TaskInstance: CSDISPATCHER_OTHERS.dispatch_restores
scheduled__2024-07-07T12:07:00+00:00 [failed]> in state failed
[2024-07-07T13:23:00.909+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.909+0100] {dagbag.py:667} DEBUG - Running
dagbag.sync_to_db with retries. Try 1 of 3
[2024-07-07T13:23:00.910+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.910+0100] {dagbag.py:672} DEBUG - Calling the
DAG.bulk_sync_to_db method
[2024-07-07T13:23:00.912+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.912+0100] {serialized_dag.py:166} DEBUG - Checking if DAG
(CSDISPATCHER_OTHERS) changed
[2024-07-07T13:23:00.930+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.930+0100] {serialized_dag.py:177} DEBUG - Serialized DAG
(CSDISPATCHER_OTHERS) is unchanged. Skipping writing to DB
[2024-07-07T13:23:00.931+0100] {logging_mixin.py:188} INFO -
[2024-07-07T13:23:00.931+0100] {dag.py:3096} INFO - Sync 1 DAGs
[2024-07-07T13:23:00.963+0100] {processor.py:183} INFO - Processing
/home/ttauto/airflow/dags/CSDISPATCHER_OTHERS.py took 0.959 seconds
```
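
One side observation from the callback run above: the alert mail went to the
literal string `['{{ var.json.DAG_FAILURE_EMAIL }}']`, so the `email`
default_arg is apparently not Jinja-rendered (it is not a templated field) and
that alert never reached a real address. A hedged sketch of resolving the
Variable at parse time instead (`DAG_FAILURE_EMAIL` is the variable name taken
from the log; treating it as JSON is an assumption based on the `var.json.*`
usage):

```python
# Hedged sketch: "email" is not a templated field, so fetch the Variable
# when the DAG file is parsed instead of embedding a Jinja expression.
from airflow.models import Variable

default_args = {
    # deserialize_json=True is an assumption based on the var.json.* usage
    "email": Variable.get("DAG_FAILURE_EMAIL", deserialize_json=True),
    "email_on_failure": True,
}
```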