potiuk commented on PR #35221:
URL: https://github.com/apache/airflow/pull/35221#issuecomment-1783285180

   This is how it manifested itself (I just checked; it was the same in the failing jobs in our CI):
   
   
   ````
   [2023-10-26T14:04:35.397+0000] {scheduler_job_runner.py:641} INFO - Sending TaskInstanceKey(dag_id='clear_subdag_test_dag', task_id='daily_job', run_id='scheduled__2016-01-01T00:00:00+00:00', try_number=1, map_index=-1) to executor with priority 2 and queue default
   [2023-10-26T14:04:35.398+0000] {base_executor.py:146} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'clear_subdag_test_dag', 'daily_job', 'scheduled__2016-01-01T00:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/test_clear_subdag.py']
   [2023-10-26T14:04:35.398+0000] {scheduler_job_runner.py:641} INFO - Sending TaskInstanceKey(dag_id='clear_subdag_test_dag', task_id='daily_job_irrelevant', run_id='scheduled__2016-01-01T00:00:00+00:00', try_number=1, map_index=-1) to executor with priority 1 and queue default
   [2023-10-26T14:04:35.398+0000] {base_executor.py:146} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'clear_subdag_test_dag', 'daily_job_irrelevant', 'scheduled__2016-01-01T00:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/test_clear_subdag.py']
   [2023-10-26T14:04:35.399+0000] {scheduler_job_runner.py:641} INFO - Sending TaskInstanceKey(dag_id='test_retry_handling_job', task_id='test_retry_handling_op', run_id='scheduled__2016-10-05T19:00:00+00:00', try_number=1, map_index=-1) to executor with priority 1 and queue default
   [2023-10-26T14:04:35.399+0000] {base_executor.py:146} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'test_retry_handling_job', 'test_retry_handling_op', 'scheduled__2016-10-05T19:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/test_retry_handling_job.py']
   [2023-10-26T14:04:35.407+0000] {local_executor.py:89} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', 'clear_subdag_test_dag', 'daily_job', 'scheduled__2016-01-01T00:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/test_clear_subdag.py']
   [2023-10-26T14:04:35.414+0000] {local_executor.py:89} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', 'clear_subdag_test_dag', 'daily_job_irrelevant', 'scheduled__2016-01-01T00:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/test_clear_subdag.py']
   [2023-10-26T14:04:35.420+0000] {local_executor.py:89} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', 'test_retry_handling_job', 'test_retry_handling_op', 'scheduled__2016-10-05T19:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/test_retry_handling_job.py']
   ````
   
   The first few lines (the ones referring to "test_clear_subdag") should not be there, because the scheduler should only have run "test_retry_handling_job". But on a fast machine, the DAGFileProcessor managed to start and serialize ONE more DAG.
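
   For illustration only, a minimal sketch (not part of this PR) of how this condition can be checked from a captured scheduler log: extract the dag_ids that were sent to the executor and assert that only "test_retry_handling_job" is among them. The regex follows the scheduler_job_runner.py lines above; `captured_scheduler_log` is a hypothetical variable holding the log text, not something the actual test defines.

   ````python
   import re

   # Matches the "Sending TaskInstanceKey(dag_id='...', ...) to executor" lines
   # emitted by scheduler_job_runner.py in the log above and captures the dag_id.
   SENT_TO_EXECUTOR = re.compile(r"Sending TaskInstanceKey\(dag_id='([^']+)'")


   def dag_ids_sent_to_executor(log_text: str) -> set[str]:
       """Return the set of dag_ids the scheduler handed to the executor."""
       return set(SENT_TO_EXECUTOR.findall(log_text))


   # Hypothetical usage against the log captured in CI:
   # sent = dag_ids_sent_to_executor(captured_scheduler_log)
   # assert sent == {"test_retry_handling_job"}, f"Unexpected DAGs scheduled: {sent}"
   ````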

