shubhampatel94 commented on issue #17507:
URL: https://github.com/apache/airflow/issues/17507#issuecomment-1534588978

   I see the same issue happening on 2.4.3.
   Multiple DAGs started getting terminated within a span of about 5 minutes, after running fine for weeks, due to this error.
   Here are my config details:
   
   Python 3.10
   Airflow 2.4.3
   Metadata DB: MySQL 8
   Executor: LocalExecutor
   run_as_user is not set (the config is empty).
   
   OS details:
   NAME="Ubuntu"
   VERSION="18.04.6 LTS (Bionic Beaver)"
   ID=ubuntu
   ID_LIKE=debian
   PRETTY_NAME="Ubuntu 18.04.6 LTS"
   VERSION_ID="18.04"
   HOME_URL="https://www.ubuntu.com/"
   SUPPORT_URL="https://help.ubuntu.com/"
   BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
   PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
   VERSION_CODENAME=bionic
   UBUNTU_CODENAME=bionic
   
   Seen this in the task logs:
   ```
   [2023-05-04, 02:28:21 EDT] {local_task_job.py:205} WARNING - Recorded pid 3738156 does not match the current pid 3634136
   [2023-05-04, 02:28:21 EDT] {process_utils.py:129} INFO - Sending Signals.SIGTERM to group 3634136. PIDs of all processes in the group: [3634136]
   [2023-05-04, 02:28:21 EDT] {process_utils.py:84} INFO - Sending the signal Signals.SIGTERM to group 3634136
   [2023-05-04, 02:28:21 EDT] {taskinstance.py:1562} ERROR - Received SIGTERM. Terminating subprocesses.
   ```
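   For context, that WARNING comes from the heartbeat callback in `local_task_job.py`: the job compares the PID recorded on the task instance in the metadata DB against the PID of the process it actually spawned, and terminates the task when they differ (which happens when a second attempt of the same task instance gets launched, as in the orphan-reset above). A rough sketch of that check — simplified pseudologic, not the actual Airflow source; function and parameter names here are my own:
   ```python
   def should_terminate(recorded_pid: int, local_proc_pid: int) -> bool:
       """Simplified sketch of the PID check in LocalTaskJob's heartbeat.

       The metadata DB records the PID of the process currently running
       the task instance. If that no longer matches the PID this job
       spawned, another process owns the task instance row, so this copy
       must stop; the caller then SIGTERMs its process group.
       """
       if recorded_pid != local_proc_pid:
           print(
               f"Recorded pid {recorded_pid} does not match "
               f"the current pid {local_proc_pid}"
           )
           return True  # caller sends Signals.SIGTERM to the group
       return False
   ```
   With the values from the log above, `should_terminate(3738156, 3634136)` is true, which matches the SIGTERM sequence that follows. (When `run_as_user` is set the real check compares against the parent PID instead, but that is not the case here since the config is empty.)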
   
   Scheduler logs from the same window:
   ```
   [2023-05-04 02:28:08,454] {manager.py:288} WARNING - DagFileProcessorManager 
(PID=3732979) exited with exit code 1 - re-launching
   [2023-05-04 02:28:08,469] {manager.py:163} INFO - Launched 
DagFileProcessorManager with pid: 3737637
   [2023-05-04 02:28:08,480] {settings.py:58} INFO - Configured default 
timezone Timezone('UTC')
   [2023-05-04 02:28:08,541] {scheduler_job.py:1380} INFO - Resetting orphaned 
tasks for active dag runs
   [2023-05-04 02:28:08,547] {scheduler_job.py:1403} INFO - Marked 1 
SchedulerJob instances as failed
   [2023-05-04 02:28:08,767] {scheduler_job.py:1444} INFO - Reset the following 
38 orphaned TaskInstances:
        <TaskInstance: *** scheduled__2023-05-04T01:05:00+00:00 [queued]>
        <TaskInstance: *** scheduled__2023-04-30T20:00:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-05-02T20:00:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-05-03T02:00:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-04-16T00:00:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-04-29T08:00:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-04-30T08:00:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-05-04T05:00:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-05-04T00:00:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-04-23T00:00:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-05-04T04:05:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-05-04T04:30:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-05-03T20:05:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-05-03T01:03:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-05-04T00:00:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-05-03T19:36:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-05-04T05:00:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-05-04T06:00:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-04-30T19:30:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-05-04T05:05:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-05-04T06:20:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-05-04T00:05:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-05-04T06:20:00+00:00 [running]>
        <TaskInstance: *** scheduled__2023-05-03T05:00:00+00:00 [running]>
   [2023-05-04 02:28:13,780] {scheduler_job.py:346} INFO - 23 tasks up for 
execution:
        <TaskInstance: *** scheduled__2023-05-04T00:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-04-30T08:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-04-30T20:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T01:03:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T02:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T20:05:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-04T04:05:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-04T05:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-04T06:20:00+00:00 [scheduled]>
   [2023-05-04 02:28:13,780] {scheduler_job.py:411} INFO - DAG *** has 0/50 
running and queued tasks
   [2023-05-04 02:28:13,780] {scheduler_job.py:411} INFO - DAG *** has 0/50 
running and queued tasks
   [2023-05-04 02:28:13,781] {scheduler_job.py:411} INFO - DAG *** has 0/50 
running and queued tasks
   [2023-05-04 02:28:13,781] {scheduler_job.py:411} INFO - DAG *** has 0/50 
running and queued tasks
   [2023-05-04 02:28:13,781] {scheduler_job.py:411} INFO - DAG *** has 0/50 
running and queued tasks
   [2023-05-04 02:28:13,781] {scheduler_job.py:411} INFO - DAG *** has 1/50 
running and queued tasks
   [2023-05-04 02:28:13,781] {scheduler_job.py:411} INFO - DAG *** has 2/50 
running and queued tasks
   [2023-05-04 02:28:13,781] {scheduler_job.py:411} INFO - DAG *** has 3/50 
running and queued tasks
   [2023-05-04 02:28:13,781] {scheduler_job.py:411} INFO - DAG *** has 4/50 
running and queued tasks
   [2023-05-04 02:28:13,781] {scheduler_job.py:411} INFO - DAG *** has 0/50 
running and queued tasks
   [2023-05-04 02:28:13,781] {scheduler_job.py:411} INFO - DAG *** has 1/50 
running and queued tasks
   [2023-05-04 02:28:13,782] {scheduler_job.py:411} INFO - DAG *** has 2/50 
running and queued tasks
   [2023-05-04 02:28:13,782] {scheduler_job.py:411} INFO - DAG *** has 3/50 
running and queued tasks
   [2023-05-04 02:28:13,782] {scheduler_job.py:411} INFO - DAG *** has 4/50 
running and queued tasks
   [2023-05-04 02:28:13,782] {scheduler_job.py:411} INFO - DAG *** has 5/50 
running and queued tasks
   [2023-05-04 02:28:13,782] {scheduler_job.py:411} INFO - DAG *** has 6/50 
running and queued tasks
   [2023-05-04 02:28:13,782] {scheduler_job.py:411} INFO - DAG *** has 5/50 
running and queued tasks
   [2023-05-04 02:28:13,782] {scheduler_job.py:411} INFO - DAG *** has 6/50 
running and queued tasks
   [2023-05-04 02:28:13,783] {scheduler_job.py:411} INFO - DAG *** has 0/50 
running and queued tasks
   [2023-05-04 02:28:13,783] {scheduler_job.py:411} INFO - DAG *** has 0/50 
running and queued tasks
   [2023-05-04 02:28:13,783] {scheduler_job.py:411} INFO - DAG *** has 0/50 
running and queued tasks
   [2023-05-04 02:28:13,783] {scheduler_job.py:411} INFO - DAG *** has 0/50 
running and queued tasks
   [2023-05-04 02:28:13,783] {scheduler_job.py:411} INFO - DAG *** has 0/50 
running and queued tasks
   [2023-05-04 02:28:13,783] {scheduler_job.py:497} INFO - Setting the 
following tasks to queued state:
        <TaskInstance: *** scheduled__2023-05-04T00:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-04-30T08:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-04-30T20:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T01:03:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T02:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T20:05:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-04T04:05:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-04T05:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-04T06:20:00+00:00 [scheduled]>
   [2023-05-04 02:28:13,960] {scheduler_job.py:536} INFO - Sending 
TaskInstanceKey(dag_id=*** ***, try_number=2, map_index=-1) to executor with 
priority 7 and queue default
   [2023-05-04 02:28:13,961] {base_executor.py:95} INFO - Adding to queue: 
['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-04T00:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:13,961] {scheduler_job.py:536} INFO - Sending 
TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-04-30T08:00:00+00:00', 
try_number=2, map_index=-1) to executor with priority 4 and queue default
   [2023-05-04 02:28:13,961] {base_executor.py:95} INFO - Adding to queue: 
['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-04-30T08:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:13,962] {scheduler_job.py:536} INFO - Sending 
TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-04-30T20:00:00+00:00', 
try_number=5, map_index=-1) to executor with priority 3 and queue default
   [2023-05-04 02:28:13,962] {base_executor.py:95} INFO - Adding to queue: 
['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-04-30T20:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:13,962] {scheduler_job.py:536} INFO - Sending 
TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T01:03:00+00:00', 
try_number=2, map_index=-1) to executor with priority 2 and queue default
   [2023-05-04 02:28:13,962] {base_executor.py:95} INFO - Adding to queue: 
['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T01:03:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:13,963] {scheduler_job.py:536} INFO - Sending 
TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T06:00:00+00:00', 
try_number=2, map_index=-1) to executor with priority 2 and queue default
   [2023-05-04 02:28:13,963] {base_executor.py:95} INFO - Adding to queue: 
['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:13,963] {scheduler_job.py:536} INFO - Sending 
TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T06:00:00+00:00', 
try_number=2, map_index=-1) to executor with priority 2 and queue default
   [2023-05-04 02:28:13,963] {base_executor.py:95} INFO - Adding to queue: 
['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:13,964] {scheduler_job.py:536} INFO - Sending 
TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T06:00:00+00:00', 
try_number=2, map_index=-1) to executor with priority 2 and queue default
   [2023-05-04 02:28:13,964] {base_executor.py:95} INFO - Adding to queue: 
['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:13,964] {scheduler_job.py:536} INFO - Sending 
TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T06:00:00+00:00', 
try_number=2, map_index=-1) to executor with priority 2 and queue default
   [2023-05-04 02:28:13,964] {base_executor.py:95} INFO - Adding to queue: 
['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:13,964] {scheduler_job.py:536} INFO - Sending 
TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T06:00:00+00:00', 
try_number=2, map_index=-1) to executor with priority 2 and queue default
   [2023-05-04 02:28:13,965] {base_executor.py:95} INFO - Adding to queue: 
['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:13,965] {scheduler_job.py:536} INFO - Sending 
TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T06:00:00+00:00', 
try_number=2, map_index=-1) to executor with priority 2 and queue default
   [2023-05-04 02:28:13,965] {base_executor.py:95} INFO - Adding to queue: 
['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:13,965] {scheduler_job.py:536} INFO - Sending 
TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T06:00:00+00:00', 
try_number=2, map_index=-1) to executor with priority 2 and queue default
   [2023-05-04 02:28:13,966] {base_executor.py:95} INFO - Adding to queue: 
['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:13,966] {scheduler_job.py:536} INFO - Sending 
TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T06:00:00+00:00', 
try_number=2, map_index=-1) to executor with priority 2 and queue default
   [2023-05-04 02:28:13,966] {base_executor.py:95} INFO - Adding to queue: 
['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:13,966] {scheduler_job.py:536} INFO - Sending 
TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T06:00:00+00:00', 
try_number=2, map_index=-1) to executor with priority 2 and queue default
   [2023-05-04 02:28:13,967] {base_executor.py:95} INFO - Adding to queue: 
['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:13,967] {scheduler_job.py:536} INFO - Sending 
TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T06:00:00+00:00', 
try_number=2, map_index=-1) to executor with priority 2 and queue default
   [2023-05-04 02:28:13,967] {base_executor.py:95} INFO - Adding to queue: 
['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:13,967] {scheduler_job.py:536} INFO - Sending 
TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T06:00:00+00:00', 
try_number=2, map_index=-1) to executor with priority 2 and queue default
   [2023-05-04 02:28:13,967] {base_executor.py:95} INFO - Adding to queue: 
['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:13,968] {scheduler_job.py:536} INFO - Sending 
TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T06:00:00+00:00', 
try_number=2, map_index=-1) to executor with priority 2 and queue default
   [2023-05-04 02:28:13,968] {base_executor.py:95} INFO - Adding to queue: 
['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:13,968] {scheduler_job.py:536} INFO - Sending 
TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T06:00:00+00:00', 
try_number=2, map_index=-1) to executor with priority 2 and queue default
   [2023-05-04 02:28:13,968] {base_executor.py:95} INFO - Adding to queue: 
['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:13,969] {scheduler_job.py:536} INFO - Sending 
TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T06:00:00+00:00', 
try_number=2, map_index=-1) to executor with priority 2 and queue default
   [2023-05-04 02:28:13,969] {base_executor.py:95} INFO - Adding to queue: 
['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:13,969] {scheduler_job.py:536} INFO - Sending 
TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T02:00:00+00:00', 
try_number=2, map_index=-1) to executor with priority 1 and queue default
   [2023-05-04 02:28:13,969] {base_executor.py:95} INFO - Adding to queue: 
['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T02:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:13,970] {scheduler_job.py:536} INFO - Sending 
TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T20:05:00+00:00', 
try_number=2, map_index=-1) to executor with priority 1 and queue default
   [2023-05-04 02:28:13,970] {base_executor.py:95} INFO - Adding to queue: 
['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T20:05:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:13,970] {scheduler_job.py:536} INFO - Sending 
TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-04T04:05:00+00:00', 
try_number=2, map_index=-1) to executor with priority 1 and queue default
   [2023-05-04 02:28:13,970] {base_executor.py:95} INFO - Adding to queue: 
['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-04T04:05:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:13,970] {scheduler_job.py:536} INFO - Sending 
TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-04T05:00:00+00:00', 
try_number=2, map_index=-1) to executor with priority 1 and queue default
   [2023-05-04 02:28:13,971] {base_executor.py:95} INFO - Adding to queue: 
['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-04T05:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:13,971] {scheduler_job.py:536} INFO - Sending 
TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-04T06:20:00+00:00', 
try_number=2, map_index=-1) to executor with priority 1 and queue default
   [2023-05-04 02:28:13,971] {base_executor.py:95} INFO - Adding to queue: 
['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-04T06:20:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:13,999] {local_executor.py:81} INFO - QueuedLocalWorker 
running ['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-04T00:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:13,999] {local_executor.py:81} INFO - QueuedLocalWorker 
running ['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-04-30T08:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:13,999] {local_executor.py:81} INFO - QueuedLocalWorker 
running ['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-04-30T20:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:14,000] {local_executor.py:81} INFO - QueuedLocalWorker 
running ['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T01:03:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:14,000] {local_executor.py:81} INFO - QueuedLocalWorker 
running ['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:14,000] {local_executor.py:81} INFO - QueuedLocalWorker 
running ['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:14,000] {local_executor.py:81} INFO - QueuedLocalWorker 
running ['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:14,001] {local_executor.py:81} INFO - QueuedLocalWorker 
running ['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:14,001] {local_executor.py:81} INFO - QueuedLocalWorker 
running ['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:14,002] {local_executor.py:81} INFO - QueuedLocalWorker 
running ['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:14,002] {local_executor.py:81} INFO - QueuedLocalWorker 
running ['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:14,002] {local_executor.py:81} INFO - QueuedLocalWorker 
running ['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:14,002] {local_executor.py:81} INFO - QueuedLocalWorker 
running ['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:14,003] {local_executor.py:81} INFO - QueuedLocalWorker 
running ['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:14,003] {local_executor.py:81} INFO - QueuedLocalWorker 
running ['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:14,004] {local_executor.py:81} INFO - QueuedLocalWorker 
running ['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:14,004] {local_executor.py:81} INFO - QueuedLocalWorker 
running ['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:14,004] {local_executor.py:81} INFO - QueuedLocalWorker 
running ['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:14,004] {local_executor.py:81} INFO - QueuedLocalWorker 
running ['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T02:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:14,005] {local_executor.py:81} INFO - QueuedLocalWorker 
running ['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-03T20:05:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:14,005] {local_executor.py:81} INFO - QueuedLocalWorker 
running ['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-04T04:05:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:14,005] {local_executor.py:81} INFO - QueuedLocalWorker 
running ['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-04T05:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:14,010] {local_executor.py:81} INFO - QueuedLocalWorker 
running ['airflow', 'tasks', 'run', '***', '***', 
'scheduled__2023-05-04T06:20:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:14,099] {dagbag.py:537} INFO - Filling up the DagBag from 
/mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:14,108] {dagbag.py:537} INFO - Filling up the DagBag from 
/mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:14,110] {dagbag.py:537} INFO - Filling up the DagBag from 
/mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:14,127] {dagbag.py:537} INFO - Filling up the DagBag from 
/mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:14,134] {dagbag.py:537} INFO - Filling up the DagBag from 
/mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:14,137] {dagbag.py:537} INFO - Filling up the DagBag from 
/mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:14,152] {dagbag.py:537} INFO - Filling up the DagBag from 
/mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:14,194] {dagbag.py:537} INFO - Filling up the DagBag from 
/mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:14,237] {dagbag.py:537} INFO - Filling up the DagBag from 
/mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:14,247] {dagbag.py:537} INFO - Filling up the DagBag from 
/mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:14,248] {dagbag.py:537} INFO - Filling up the DagBag from 
/mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:14,250] {dagbag.py:537} INFO - Filling up the DagBag from 
/mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:14,251] {dagbag.py:537} INFO - Filling up the DagBag from 
/mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:14,253] {dagbag.py:537} INFO - Filling up the DagBag from 
/mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:14,255] {dagbag.py:537} INFO - Filling up the DagBag from 
/mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:14,257] {dagbag.py:537} INFO - Filling up the DagBag from 
/mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:14,257] {dagbag.py:537} INFO - Filling up the DagBag from 
/mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:14,275] {dagbag.py:537} INFO - Filling up the DagBag from 
/mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:14,293] {dagbag.py:537} INFO - Filling up the DagBag from 
/mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:14,304] {dagbag.py:537} INFO - Filling up the DagBag from 
/mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:14,318] {dagbag.py:537} INFO - Filling up the DagBag from 
/mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:14,324] {dagbag.py:537} INFO - Filling up the DagBag from 
/mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:14,332] {dagbag.py:537} INFO - Filling up the DagBag from 
/mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:14,517] {base.py:71} INFO - Using connection ID ssh_*** 
for task execution.
   [2023-05-04 02:28:14,591] {base.py:71} INFO - Using connection ID ssh_*** 
for task execution.
   [2023-05-04 02:28:14,595] {base.py:71} INFO - Using connection ID ssh_*** 
for task execution.
   [2023-05-04 02:28:14,597] {base.py:71} INFO - Using connection ID ssh_*** 
for task execution.
   [2023-05-04 02:28:14,682] {base.py:71} INFO - Using connection ID ssh_*** 
for task execution.
   [2023-05-04 02:28:14,733] {base.py:71} INFO - Using connection ID ssh_*** 
for task execution.
   [2023-05-04 02:28:14,757] {base.py:71} INFO - Using connection ID ssh_*** 
for task execution.
   [2023-05-04 02:28:14,771] {base.py:71} INFO - Using connection ID ssh_*** 
for task execution.
   [2023-05-04 02:28:14,782] {base.py:71} INFO - Using connection ID ssh_*** 
for task execution.
   [2023-05-04 02:28:14,790] {base.py:71} INFO - Using connection ID ssh_*** 
for task execution.
   [2023-05-04 02:28:14,822] {base.py:71} INFO - Using connection ID ssh_*** 
for task execution.
   [2023-05-04 02:28:14,840] {base.py:71} INFO - Using connection ID ssh_*** 
for task execution.
   [2023-05-04 02:28:14,844] {base.py:71} INFO - Using connection ID ssh_*** 
for task execution.
   [2023-05-04 02:28:14,845] {base.py:71} INFO - Using connection ID ssh_*** 
for task execution.
   [2023-05-04 02:28:14,847] {base.py:71} INFO - Using connection ID ssh_*** 
for task execution.
   [2023-05-04 02:28:14,882] {base.py:71} INFO - Using connection ID ssh_*** 
for task execution.
   [2023-05-04 02:28:14,893] {base.py:71} INFO - Using connection ID ssh_*** 
for task execution.
   [2023-05-04 02:28:14,902] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:14,918] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:14,935] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:14,937] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:14,939] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:14,952] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:14,973] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:14,979] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:15,015] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:15,024] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:15,040] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:15,044] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:15,054] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:15,075] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:15,079] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:15,090] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:15,095] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:15,102] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:15,136] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:15,139] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:15,183] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:15,190] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:15,225] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:15,228] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T02:00:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:15,240] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:15,248] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:15,259] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:15,282] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:15,315] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:15,353] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:15,384] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:15,412] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:15,444] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-04T04:05:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:15,445] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:15,528] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-04T05:00:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:15,568] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:15,583] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:15,669] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-04-30T20:00:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:15,669] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-04T06:20:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:15,707] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:15,710] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:15,779] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T01:03:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:15,779] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:15,796] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T20:05:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:15,861] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-04T00:00:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:15,952] {credentials.py:1255} INFO - Found credentials in shared credentials file: ~/.aws/credentials
   [2023-05-04 02:28:15,990] {scheduler_job.py:346} INFO - 9 tasks up for execution:
        <TaskInstance: *** scheduled__2023-04-29T08:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-02T20:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-04T05:05:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T05:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-04T00:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-04-30T19:30:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T19:36:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-04T00:05:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-04T05:00:00+00:00 [scheduled]>
   [2023-05-04 02:28:15,995] {scheduler_job.py:411} INFO - DAG *** has 1/50 running and queued tasks
   [2023-05-04 02:28:15,996] {scheduler_job.py:411} INFO - DAG *** has 1/50 running and queued tasks
   [2023-05-04 02:28:15,996] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
   [2023-05-04 02:28:15,996] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
   [2023-05-04 02:28:15,996] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
   [2023-05-04 02:28:15,996] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
   [2023-05-04 02:28:15,996] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
   [2023-05-04 02:28:15,996] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
   [2023-05-04 02:28:15,996] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
   [2023-05-04 02:28:15,997] {scheduler_job.py:497} INFO - Setting the following tasks to queued state:
        <TaskInstance: *** scheduled__2023-04-29T08:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-02T20:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-04T05:05:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T05:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-04T00:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-04-30T19:30:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-03T19:36:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-04T00:05:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-04T05:00:00+00:00 [scheduled]>
   [2023-05-04 02:28:16,014] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-04-29T08:00:00+00:00', try_number=2, map_index=-1) to executor with priority 4 and queue default
   [2023-05-04 02:28:16,015] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-04-29T08:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:16,015] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-02T20:00:00+00:00', try_number=2, map_index=-1) to executor with priority 3 and queue default
   [2023-05-04 02:28:16,016] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-02T20:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:16,016] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-04T05:05:00+00:00', try_number=2, map_index=-1) to executor with priority 3 and queue default
   [2023-05-04 02:28:16,016] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T05:05:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:16,016] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T05:00:00+00:00', try_number=2, map_index=-1) to executor with priority 2 and queue default
   [2023-05-04 02:28:16,016] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T05:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:16,016] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-04T00:00:00+00:00', try_number=2, map_index=-1) to executor with priority 2 and queue default
   [2023-05-04 02:28:16,016] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T00:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:16,017] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-04-30T19:30:00+00:00', try_number=2, map_index=-1) to executor with priority 1 and queue default
   [2023-05-04 02:28:16,017] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-04-30T19:30:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:16,017] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T19:36:00+00:00', try_number=2, map_index=-1) to executor with priority 1 and queue default
   [2023-05-04 02:28:16,017] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T19:36:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:16,017] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-04T00:05:00+00:00', try_number=2, map_index=-1) to executor with priority 1 and queue default
   [2023-05-04 02:28:16,017] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T00:05:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:16,018] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-04T05:00:00+00:00', try_number=2, map_index=-1) to executor with priority 1 and queue default
   [2023-05-04 02:28:16,018] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T05:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:16,051] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-04-29T08:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:16,051] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-02T20:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:16,051] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T05:05:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:16,051] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T05:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:16,052] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T00:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:16,052] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-04-30T19:30:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:16,052] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T19:36:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:16,052] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T00:05:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:16,053] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T05:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:16,256] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:16,340] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:16,348] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:16,355] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:16,399] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:16,410] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:16,434] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:16,434] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:16,446] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:16,589] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:16,703] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:16,828] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:16,941] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:17,047] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:17,246] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:17,257] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:17,284] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:17,306] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:17,320] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:17,423] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:17,450] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-04T00:00:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:17,455] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:17,477] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:17,682] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-02T20:00:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:17,894] {credentials.py:1255} INFO - Found credentials in shared credentials file: ~/.aws/credentials
   [2023-05-04 02:28:17,913] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-04T05:00:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:18,055] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T19:36:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:18,196] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-04T00:05:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:18,288] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-04-30T19:30:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:18,317] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T05:00:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:18,734] {scheduler_job.py:346} INFO - 6 tasks up for execution:
        <TaskInstance: *** scheduled__2023-05-04T01:05:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-04-16T00:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-04-23T00:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-04T04:30:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-04T06:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-04T06:20:00+00:00 [scheduled]>
   [2023-05-04 02:28:18,735] {scheduler_job.py:411} INFO - DAG *** has 1/50 running and queued tasks
   [2023-05-04 02:28:18,735] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
   [2023-05-04 02:28:18,735] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
   [2023-05-04 02:28:18,735] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
   [2023-05-04 02:28:18,735] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
   [2023-05-04 02:28:18,735] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
   [2023-05-04 02:28:18,735] {scheduler_job.py:497} INFO - Setting the following tasks to queued state:
        <TaskInstance: *** scheduled__2023-05-04T01:05:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-04-16T00:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-04-23T00:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-04T04:30:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-04T06:00:00+00:00 [scheduled]>
        <TaskInstance: *** scheduled__2023-05-04T06:20:00+00:00 [scheduled]>
   [2023-05-04 02:28:18,812] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-04T01:05:00+00:00', try_number=1, map_index=-1) to executor with priority 4 and queue default
   [2023-05-04 02:28:18,812] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T01:05:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:18,813] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-04-16T00:00:00+00:00', try_number=2, map_index=-1) to executor with priority 1 and queue default
   [2023-05-04 02:28:18,813] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-04-16T00:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:18,813] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-04-23T00:00:00+00:00', try_number=2, map_index=-1) to executor with priority 1 and queue default
   [2023-05-04 02:28:18,813] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-04-23T00:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:18,813] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-04T04:30:00+00:00', try_number=2, map_index=-1) to executor with priority 1 and queue default
   [2023-05-04 02:28:18,813] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T04:30:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:18,813] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-04T06:00:00+00:00', try_number=2, map_index=-1) to executor with priority 1 and queue default
   [2023-05-04 02:28:18,814] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T06:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:18,814] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-04T06:20:00+00:00', try_number=2, map_index=-1) to executor with priority 1 and queue default
   [2023-05-04 02:28:18,814] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T06:20:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:18,861] {credentials.py:1255} INFO - Found credentials in shared credentials file: ~/.aws/credentials
   [2023-05-04 02:28:18,871] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T01:05:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:18,872] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-04-16T00:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:18,872] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-04-23T00:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:18,872] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T04:30:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:18,873] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T06:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:18,873] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T06:20:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:18,983] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:19,088] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:19,110] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:19,132] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:19,133] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:19,135] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:19,160] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:19,171] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:19,212] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:19,286] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:19,287] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:19,365] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:19,431] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:19,453] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:19,492] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:19,499] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:19,531] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:19,540] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:19,580] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:19,581] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:19,608] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:19,648] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:19,664] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:19,670] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:19,680] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:19,723] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:19,753] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:19,765] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:19,765] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:19,775] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:19,786] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-04-23T00:00:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:19,794] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:19,803] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:19,822] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:19,830] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:19,874] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-04-16T00:00:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:19,889] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:19,963] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:19,998] {credentials.py:1255} INFO - Found credentials in shared credentials file: ~/.aws/credentials
   [2023-05-04 02:28:20,029] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:20,079] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-04T04:30:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:20,085] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:20,128] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:20,179] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-04T06:00:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:20,204] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:20,241] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-04T06:20:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:20,240] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:20,298] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:20,300] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:20,327] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:20,352] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:20,396] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:20,406] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:20,441] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:20,460] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:20,489] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:20,491] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:20,545] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:20,545] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:20,598] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:20,607] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:20,660] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:20,666] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:20,696] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:20,737] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:20,750] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:20,783] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:20,829] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:20,865] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:20,928] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:20,955] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:20,989] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:21,014] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:21,064] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:21,091] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:21,107] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-04T05:05:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:21,150] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:21,287] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:21,482] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:21,643] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:21,844] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:21,979] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:22,140] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:22,324] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:22,492] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:22,646] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:22,790] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:22,881] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:23,260] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-04T01:05:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:23,499] {local_executor.py:130} ERROR - Failed to execute task PID of job runner does not match.
   Traceback (most recent call last):
     File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/executors/local_executor.py", line 126, in _execute_work_in_fork
       args.func(args)
     File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/cli_parser.py", line 52, in command
       return func(*args, **kwargs)
     File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/utils/cli.py", line 103, in wrapper
       return f(*args, **kwargs)
     File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/commands/task_command.py", line 382, in task_run
       _run_task_by_selected_method(args, dag, ti)
     File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/commands/task_command.py", line 189, in _run_task_by_selected_method
       _run_task_by_local_task_job(args, ti)
     File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/commands/task_command.py", line 247, in _run_task_by_local_task_job
       run_job.run()
     File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/base_job.py", line 247, in run
       self._execute()
     File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/local_task_job.py", line 135, in _execute
       self.heartbeat()
     File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/base_job.py", line 228, in heartbeat
       self.heartbeat_callback(session=session)
     File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/utils/session.py", line 72, in wrapper
       return func(*args, **kwargs)
     File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/local_task_job.py", line 208, in heartbeat_callback
       raise AirflowException("PID of job runner does not match")
   airflow.exceptions.AirflowException: PID of job runner does not match
   [2023-05-04 02:28:23,629] {scheduler_job.py:588} INFO - Executor reports execution of *** run_id=scheduled__2023-05-04T00:05:00+00:00 exited with status failed for try_number 1
   [2023-05-04 02:28:23,627] {local_executor.py:130} ERROR - Failed to execute task PID of job runner does not match.
   [2023-05-04 02:28:23,640] {scheduler_job.py:631} INFO - TaskInstance Finished: dag_id=*** task_id=***, run_id=scheduled__2023-05-04T00:05:00+00:00, map_index=-1, run_start_date=2023-05-04 06:28:18.756761+00:00, run_end_date=2023-05-04 06:28:22.505894+00:00, run_duration=3.74913, state=up_for_retry, executor_state=failed, try_number=1, max_tries=3, job_id=277262, pool=default_pool, queue=default, priority_weight=1, operator=SSHOperator, queued_dttm=2023-05-04 06:28:15.998301+00:00, queued_by_job_id=233035, pid=3738404
   [2023-05-04 02:28:23,664] {local_executor.py:130} ERROR - Failed to execute task PID of job runner does not match.
   [2023-05-04 02:28:23,675] {local_executor.py:130} ERROR - Failed to execute task PID of job runner does not match.
   [2023-05-04 02:28:23,696] {local_executor.py:130} ERROR - Failed to execute task PID of job runner does not match.
   [2023-05-04 02:28:23,737] {local_executor.py:130} ERROR - Failed to execute task PID of job runner does not match.
   [2023-05-04 02:28:23,791] {local_executor.py:130} ERROR - Failed to execute task PID of job runner does not match.
   [2023-05-04 02:28:23,801] {local_executor.py:130} ERROR - Failed to execute task PID of job runner does not match.
   [2023-05-04 02:28:24,815] {dagrun.py:578} ERROR - Marking run <DagRun *** @ 2023-05-04 05:00:00+00:00: scheduled__2023-05-04T05:00:00+00:00, state:running, queued_at: 2023-05-04 06:00:02.373050+00:00. externally triggered: False> failed
   [2023-05-04 02:28:24,816] {dagrun.py:644} INFO - DagRun *** dag_id=*** execution_date=2023-05-04 05:00:00+00:00, run_id=scheduled__2023-05-04T05:00:00+00:00, run_start_date=2023-05-04 06:00:02.574912+00:00, run_end_date=2023-05-04 06:28:24.815624+00:00, run_duration=1702.240712, state=failed, external_trigger=False, run_type=scheduled, data_interval_start=2023-05-04 05:00:00+00:00, data_interval_end=2023-05-04 06:00:00+00:00, dag_hash=4d788176fc57f5eb934f9ab5b04a02db
   [2023-05-04 02:28:24,839] {dag.py:3336} INFO - Setting next_dagrun for *** to 2023-05-04T06:00:00+00:00, run_after=2023-05-04T07:00:00+00:00
   [2023-05-04 02:28:25,750] {local_executor.py:130} ERROR - Failed to execute task PID of job runner does not match.
   [2023-05-04 02:28:25,805] {scheduler_job.py:588} INFO - Executor reports execution of *** run_id=scheduled__2023-05-04T05:00:00+00:00 exited with status failed for try_number 1
   [2023-05-04 02:28:25,805] {scheduler_job.py:588} INFO - Executor reports execution of *** run_id=scheduled__2023-05-04T05:00:00+00:00 exited with status failed for try_number 1
   [2023-05-04 02:28:25,805] {scheduler_job.py:588} INFO - Executor reports execution of *** run_id=scheduled__2023-05-03T06:00:00+00:00 exited with status failed for try_number 1
   [2023-05-04 02:28:25,805] {scheduler_job.py:588} INFO - Executor reports execution of *** run_id=scheduled__2023-05-03T06:00:00+00:00 exited with status failed for try_number 1
   [2023-05-04 02:28:25,806] {scheduler_job.py:588} INFO - Executor reports execution of *** run_id=scheduled__2023-05-03T06:00:00+00:00 exited with status failed for try_number 1
   [2023-05-04 02:28:25,806] {scheduler_job.py:588} INFO - Executor reports execution of *** run_id=scheduled__2023-05-03T06:00:00+00:00 exited with status failed for try_number 1
   [2023-05-04 02:28:25,806] {scheduler_job.py:588} INFO - Executor reports execution of *** run_id=scheduled__2023-05-03T06:00:00+00:00 exited with status failed for try_number 1
   [2023-05-04 02:28:25,806] {scheduler_job.py:588} INFO - Executor reports execution of *** run_id=scheduled__2023-05-04T04:05:00+00:00 exited with status failed for try_number 1
   [2023-05-04 02:28:25,827] {scheduler_job.py:631} INFO - TaskInstance Finished: dag_id=*** task_id=***, run_id=scheduled__2023-05-04T05:00:00+00:00, map_index=-1, run_start_date=2023-05-04 06:28:18.434524+00:00, run_end_date=2023-05-04 06:28:22.240840+00:00, run_duration=3.80632, state=up_for_retry, executor_state=failed, try_number=1, max_tries=2, job_id=277260, pool=default_pool, queue=default, priority_weight=1, operator=SSHOperator, queued_dttm=2023-05-04 06:28:15.998301+00:00, queued_by_job_id=233035, pid=3738390
   [2023-05-04 02:28:25,827] {scheduler_job.py:631} INFO - TaskInstance Finished: dag_id=*** task_id=***, run_id=scheduled__2023-05-04T04:05:00+00:00, map_index=-1, run_start_date=2023-05-04 06:28:15.776811+00:00, run_end_date=2023-05-04 06:28:25.632014+00:00, run_duration=9.8552, state=up_for_retry, executor_state=failed, try_number=1, max_tries=2, job_id=277246, pool=default_pool, queue=default, priority_weight=1, operator=SSHOperator, queued_dttm=2023-05-04 06:28:13.784565+00:00, queued_by_job_id=233035, pid=3738075
   [2023-05-04 02:28:25,827] {scheduler_job.py:631} INFO - TaskInstance Finished: dag_id=*** task_id=***, run_id=scheduled__2023-05-04T05:00:00+00:00, map_index=-1, run_start_date=2023-05-04 06:28:15.996686+00:00, run_end_date=2023-05-04 06:28:22.783233+00:00, run_duration=6.78655, state=failed, executor_state=failed, try_number=1, max_tries=0, job_id=277247, pool=default_pool, queue=default, priority_weight=1, operator=SSHOperator, queued_dttm=2023-05-04 06:28:13.784565+00:00, queued_by_job_id=233035, pid=3738153
   [2023-05-04 02:28:25,828] {scheduler_job.py:631} INFO - TaskInstance Finished: dag_id=*** task_id=***, run_id=scheduled__2023-05-03T06:00:00+00:00, map_index=-1, run_start_date=2023-05-04 06:28:15.871250+00:00, run_end_date=2023-05-04 06:28:22.188521+00:00, run_duration=6.31727, state=failed, executor_state=failed, try_number=1, max_tries=0, job_id=277245, pool=default_pool, queue=default, priority_weight=2, operator=ExternalTaskSensor, queued_dttm=2023-05-04 06:28:13.784565+00:00, queued_by_job_id=233035, pid=3738084
   [2023-05-04 02:28:25,828] {scheduler_job.py:631} INFO - TaskInstance Finished: dag_id=*** task_id=***, run_id=scheduled__2023-05-03T06:00:00+00:00, map_index=-1, run_start_date=2023-05-04 06:28:15.669887+00:00, run_end_date=2023-05-04 06:28:21.537265+00:00, run_duration=5.86738, state=failed, executor_state=failed, try_number=1, max_tries=0, job_id=277239, pool=default_pool, queue=default, priority_weight=2, operator=ExternalTaskSensor, queued_dttm=2023-05-04 06:28:13.784565+00:00, queued_by_job_id=233035, pid=3738064
   [2023-05-04 02:28:25,829] {scheduler_job.py:631} INFO - TaskInstance Finished: dag_id=*** task_id=***, run_id=scheduled__2023-05-03T06:00:00+00:00, map_index=-1, run_start_date=2023-05-04 06:28:15.554503+00:00, run_end_date=2023-05-04 06:28:22.238525+00:00, run_duration=6.68402, state=failed, executor_state=failed, try_number=1, max_tries=0, job_id=277240, pool=default_pool, queue=default, priority_weight=2, operator=ExternalTaskSensor, queued_dttm=2023-05-04 06:28:13.784565+00:00, queued_by_job_id=233035, pid=3738047
   [2023-05-04 02:28:25,829] {scheduler_job.py:631} INFO - TaskInstance Finished: dag_id=*** task_id=***, run_id=scheduled__2023-05-03T06:00:00+00:00, map_index=-1, run_start_date=2023-05-04 06:28:16.023921+00:00, run_end_date=2023-05-04 06:28:21.406309+00:00, run_duration=5.38239, state=failed, executor_state=failed, try_number=1, max_tries=0, job_id=277253, pool=default_pool, queue=default, priority_weight=2, operator=ExternalTaskSensor, queued_dttm=2023-05-04 06:28:13.784565+00:00, queued_by_job_id=233035, pid=3738156
   [2023-05-04 02:28:25,829] {scheduler_job.py:631} INFO - TaskInstance Finished: dag_id=*** task_id=***, run_id=scheduled__2023-05-03T06:00:00+00:00, map_index=-1, run_start_date=2023-05-04 06:28:16.405103+00:00, run_end_date=2023-05-04 06:28:21.631632+00:00, run_duration=5.22653, state=failed, executor_state=failed, try_number=1, max_tries=0, job_id=277252, pool=default_pool, queue=default, priority_weight=2, operator=ExternalTaskSensor, queued_dttm=2023-05-04 06:28:13.784565+00:00, queued_by_job_id=233035, pid=3738180
   [2023-05-04 02:28:25,891] {local_executor.py:130} ERROR - Failed to execute task PID of job runner does not match.
   [2023-05-04 02:28:26,251] {local_executor.py:130} ERROR - Failed to execute task PID of job runner does not match.
   [2023-05-04 02:28:27,011] {dagrun.py:578} ERROR - Marking run <DagRun *** @ 2023-04-16 00:00:00+00:00: scheduled__2023-04-16T00:00:00+00:00, state:running, queued_at: 2023-05-01 00:00:06.474932+00:00. externally triggered: False> failed
   [2023-05-04 02:28:27,012] {dagrun.py:644} INFO - DagRun *** dag_id=*** execution_date=2023-04-16 00:00:00+00:00, run_id=scheduled__2023-04-16T00:00:00+00:00, run_start_date=2023-05-01 00:00:06.606220+00:00, run_end_date=2023-05-04 06:28:27.012156+00:00, run_duration=282500.405936, state=failed, external_trigger=False, run_type=scheduled, data_interval_start=2023-04-16 00:00:00+00:00, data_interval_end=2023-05-01 00:00:00+00:00, dag_hash=71706b64a19537a7749aa8af2dd1f8e0
   [2023-05-04 02:28:27,021] {dag.py:3336} INFO - Setting next_dagrun for *** to 2023-05-01T00:00:00+00:00, run_after=2023-05-16T00:00:00+00:00
   [2023-05-04 02:28:28,203] {scheduler_job.py:588} INFO - Executor reports execution of *** run_id=scheduled__2023-05-03T20:05:00+00:00 exited with status failed for try_number 1
   [2023-05-04 02:28:28,203] {scheduler_job.py:588} INFO - Executor reports execution of *** run_id=scheduled__2023-04-16T00:00:00+00:00 exited with status failed for try_number 1
   [2023-05-04 02:28:28,212] {scheduler_job.py:631} INFO - TaskInstance Finished: dag_id=*** task_id=***, run_id=scheduled__2023-04-16T00:00:00+00:00, map_index=-1, run_start_date=2023-05-04 06:28:20.053437+00:00, run_end_date=2023-05-04 06:28:25.618675+00:00, run_duration=5.56524, state=failed, executor_state=failed, try_number=1, max_tries=0, job_id=277266, pool=default_pool, queue=default, priority_weight=1, operator=SSHOperator, queued_dttm=2023-05-04 06:28:18.736412+00:00, queued_by_job_id=233035, pid=3738527
   [2023-05-04 02:28:28,213] {scheduler_job.py:631} INFO - TaskInstance Finished: dag_id=*** task_id=***, run_id=scheduled__2023-05-03T20:05:00+00:00, map_index=-1, run_start_date=2023-05-04 06:28:16.315525+00:00, run_end_date=2023-05-04 06:28:25.669738+00:00, run_duration=9.35421, state=up_for_retry, executor_state=failed, try_number=1, max_tries=2, job_id=277256, pool=default_pool, queue=default, priority_weight=1, operator=SSHOperator, queued_dttm=2023-05-04 06:28:13.784565+00:00, queued_by_job_id=233035, pid=3738170
   [2023-05-04 02:28:29,882] {scheduler_job.py:346} INFO - 1 tasks up for execution:
        <TaskInstance: *** scheduled__2023-05-02T20:00:00+00:00 [scheduled]>
   [2023-05-04 02:28:29,882] {scheduler_job.py:411} INFO - DAG *** has 1/50 running and queued tasks
   [2023-05-04 02:28:29,882] {scheduler_job.py:497} INFO - Setting the following tasks to queued state:
        <TaskInstance: *** scheduled__2023-05-02T20:00:00+00:00 [scheduled]>
   [2023-05-04 02:28:29,887] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-02T20:00:00+00:00', try_number=1, map_index=-1) to executor with priority 1 and queue default
   [2023-05-04 02:28:29,887] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-02T20:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:29,905] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-02T20:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
   [2023-05-04 02:28:30,010] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
   [2023-05-04 02:28:31,085] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
   [2023-05-04 02:28:31,720] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-02T20:00:00+00:00 [queued]> on host ***
   [2023-05-04 02:28:32,005] {scheduler_job.py:588} INFO - Executor reports execution of *** run_id=scheduled__2023-05-02T20:00:00+00:00 exited with status success for try_number 1
   [2023-05-04 02:28:32,014] {scheduler_job.py:631} INFO - TaskInstance Finished: dag_id=*** task_id=***, run_id=scheduled__2023-05-02T20:00:00+00:00, map_index=-1, run_start_date=2023-05-04 06:28:18.191068+00:00, run_end_date=2023-05-04 06:28:28.899988+00:00, run_duration=10.7089, state=failed, executor_state=success, try_number=1, max_tries=1, job_id=277259, pool=default_pool, queue=default, priority_weight=3, operator=SSHOperator, queued_dttm=2023-05-04 06:28:15.998301+00:00, queued_by_job_id=233035, pid=3738373
   [2023-05-04 02:28:38,357] {scheduler_job.py:588} INFO - Executor reports execution of *** run_id=scheduled__2023-05-02T20:00:00+00:00 exited with status success for try_number 1
   [2023-05-04 02:28:38,396] {scheduler_job.py:631} INFO - TaskInstance Finished: dag_id=*** task_id=***, run_id=scheduled__2023-05-02T20:00:00+00:00, map_index=-1, run_start_date=2023-05-04 06:28:31.867672+00:00, run_end_date=2023-05-04 06:28:33.320185+00:00, run_duration=1.45251, state=success, executor_state=success, try_number=1, max_tries=1, job_id=277272, pool=default_pool, queue=default, priority_weight=1, operator=PythonOperator, queued_dttm=2023-05-04 06:28:29.883257+00:00, queued_by_job_id=233035, pid=3739407
   [2023-05-04 02:28:40,472] {local_executor.py:130} ERROR - Failed to execute 
task PID of job runner does not match.
   Traceback (most recent call last):
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/executors/local_executor.py",
 line 126, in _execute_work_in_fork
       args.func(args)
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/cli_parser.py", 
line 52, in command
       return func(*args, **kwargs)
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/utils/cli.py", line 
103, in wrapper
       return f(*args, **kwargs)
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/commands/task_command.py",
 line 382, in task_run
       _run_task_by_selected_method(args, dag, ti)
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/commands/task_command.py",
 line 189, in _run_task_by_selected_method
       _run_task_by_local_task_job(args, ti)
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/commands/task_command.py",
 line 247, in _run_task_by_local_task_job
       run_job.run()
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/base_job.py", 
line 247, in run
       self._execute()
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/local_task_job.py",
 line 135, in _execute
       self.heartbeat()
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/base_job.py", 
line 228, in heartbeat
       self.heartbeat_callback(session=session)
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/utils/session.py", 
line 72, in wrapper
       return func(*args, **kwargs)
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/local_task_job.py",
 line 208, in heartbeat_callback
       raise AirflowException("PID of job runner does not match")
   airflow.exceptions.AirflowException: PID of job runner does not match
   [2023-05-04 02:28:41,951] {dagrun.py:578} ERROR - Marking run <DagRun *** @ 
2023-05-02 20:00:00+00:00: scheduled__2023-05-02T20:00:00+00:00, state:running, 
queued_at: 2023-05-03 20:00:01.228399+00:00. externally triggered: False> failed
   [2023-05-04 02:28:41,952] {dagrun.py:644} INFO - DagRun *** dag_id=*** 
execution_date=2023-05-02 20:00:00+00:00, 
run_id=scheduled__2023-05-02T20:00:00+00:00, run_start_date=2023-05-03 
20:00:01.913706+00:00, run_end_date=2023-05-04 06:28:41.952025+00:00, 
run_duration=37720.038319, state=failed, external_trigger=False, 
run_type=scheduled, data_interval_start=2023-05-02 20:00:00+00:00, 
data_interval_end=2023-05-03 20:00:00+00:00, 
dag_hash=07a45c48201bfc14afb9d7e4c643bbdb
   [2023-05-04 02:28:41,980] {dag.py:3336} INFO - Setting next_dagrun for *** 
to 2023-05-03T20:00:00+00:00, run_after=2023-05-04T20:00:00+00:00
   [2023-05-04 02:28:42,422] {local_executor.py:130} ERROR - Failed to execute 
task PID of job runner does not match.
   Traceback (most recent call last):
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/executors/local_executor.py",
 line 126, in _execute_work_in_fork
       args.func(args)
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/cli_parser.py", 
line 52, in command
       return func(*args, **kwargs)
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/utils/cli.py", line 
103, in wrapper
       return f(*args, **kwargs)
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/commands/task_command.py",
 line 382, in task_run
       _run_task_by_selected_method(args, dag, ti)
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/commands/task_command.py",
 line 189, in _run_task_by_selected_method
       _run_task_by_local_task_job(args, ti)
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/commands/task_command.py",
 line 247, in _run_task_by_local_task_job
       run_job.run()
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/base_job.py", 
line 247, in run
       self._execute()
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/local_task_job.py",
 line 135, in _execute
       self.heartbeat()
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/base_job.py", 
line 228, in heartbeat
       self.heartbeat_callback(session=session)
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/utils/session.py", 
line 72, in wrapper
       return func(*args, **kwargs)
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/local_task_job.py",
 line 208, in heartbeat_callback
       raise AirflowException("PID of job runner does not match")
   airflow.exceptions.AirflowException: PID of job runner does not match
   [2023-05-04 02:28:42,502] {scheduler_job.py:588} INFO - Executor reports 
execution of *** run_id=scheduled__2023-05-03T01:03:00+00:00 exited with status 
failed for try_number 1
   [2023-05-04 02:28:42,502] {scheduler_job.py:588} INFO - Executor reports 
execution of *** run_id=scheduled__2023-05-04T06:20:00+00:00 exited with status 
failed for try_number 1
   [2023-05-04 02:28:42,511] {scheduler_job.py:631} INFO - TaskInstance 
Finished: dag_id=*** task_id=***, run_id=scheduled__2023-05-03T01:03:00+00:00, 
map_index=-1, run_start_date=2023-05-04 06:28:16.302772+00:00, 
run_end_date=2023-05-04 06:28:38.024822+00:00, run_duration=21.722, 
state=failed, executor_state=failed, try_number=1, max_tries=0, job_id=277255, 
pool=default_pool, queue=default, priority_weight=2, operator=SSHOperator, 
queued_dttm=2023-05-04 06:28:13.784565+00:00, queued_by_job_id=233035, 
pid=3738168
   [2023-05-04 02:28:42,511] {scheduler_job.py:631} INFO - TaskInstance 
Finished: dag_id=*** task_id=***, run_id=scheduled__2023-05-04T06:20:00+00:00, 
map_index=-1, run_start_date=2023-05-04 06:28:16.282425+00:00, 
run_end_date=2023-05-04 06:28:41.372880+00:00, run_duration=25.0905, 
state=failed, executor_state=failed, try_number=1, max_tries=0, job_id=277251, 
pool=default_pool, queue=default, priority_weight=1, operator=SSHOperator, 
queued_dttm=2023-05-04 06:28:13.784565+00:00, queued_by_job_id=233035, 
pid=3738169
   [2023-05-04 02:28:42,661] {local_executor.py:130} ERROR - Failed to execute 
task PID of job runner does not match.
   Traceback (most recent call last):
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/executors/local_executor.py",
 line 126, in _execute_work_in_fork
       args.func(args)
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/cli_parser.py", 
line 52, in command
       return func(*args, **kwargs)
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/utils/cli.py", line 
103, in wrapper
       return f(*args, **kwargs)
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/commands/task_command.py",
 line 382, in task_run
       _run_task_by_selected_method(args, dag, ti)
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/commands/task_command.py",
 line 189, in _run_task_by_selected_method
       _run_task_by_local_task_job(args, ti)
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/commands/task_command.py",
 line 247, in _run_task_by_local_task_job
       run_job.run()
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/base_job.py", 
line 247, in run
       self._execute()
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/local_task_job.py",
 line 135, in _execute
       self.heartbeat()
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/base_job.py", 
line 228, in heartbeat
       self.heartbeat_callback(session=session)
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/utils/session.py", 
line 72, in wrapper
       return func(*args, **kwargs)
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/local_task_job.py",
 line 208, in heartbeat_callback
       raise AirflowException("PID of job runner does not match")
   airflow.exceptions.AirflowException: PID of job runner does not match
   [2023-05-04 02:28:43,636] {dagrun.py:578} ERROR - Marking run <DagRun *** @ 
2023-05-04 06:20:00+00:00: scheduled__2023-05-04T06:20:00+00:00, state:running, 
queued_at: 2023-05-04 06:25:33.747936+00:00. externally triggered: False> failed
   [2023-05-04 02:28:43,637] {dagrun.py:644} INFO - DagRun *** dag_id=*** 
execution_date=2023-05-04 06:20:00+00:00, 
run_id=scheduled__2023-05-04T06:20:00+00:00, run_start_date=2023-05-04 
06:25:33.806284+00:00, run_end_date=2023-05-04 06:28:43.637144+00:00, 
run_duration=189.83086, state=failed, external_trigger=False, 
run_type=scheduled, data_interval_start=2023-05-04 06:20:00+00:00, 
data_interval_end=2023-05-04 06:25:00+00:00, 
dag_hash=c6c806de8a6d29d37d214803630a2667
   [2023-05-04 02:28:43,646] {dag.py:3336} INFO - Setting next_dagrun for *** 
to 2023-05-04T06:25:00+00:00, run_after=2023-05-04T06:30:00+00:00
   [2023-05-04 02:28:44,586] {scheduler_job.py:588} INFO - Executor reports 
execution of *** run_id=scheduled__2023-05-04T05:05:00+00:00 exited with status 
failed for try_number 1
   [2023-05-04 02:28:44,595] {scheduler_job.py:631} INFO - TaskInstance 
Finished: dag_id=*** task_id=***, run_id=scheduled__2023-05-04T05:05:00+00:00, 
map_index=-1, run_start_date=2023-05-04 06:28:21.528300+00:00, 
run_end_date=2023-05-04 06:28:42.518193+00:00, run_duration=20.9899, 
state=up_for_retry, executor_state=failed, try_number=1, max_tries=3, 
job_id=277270, pool=default_pool, queue=default, priority_weight=3, 
operator=SSHOperator, queued_dttm=2023-05-04 06:28:15.998301+00:00, 
queued_by_job_id=233035, pid=3738666
   [2023-05-04 02:28:46,942] {dagrun.py:578} ERROR - Marking run <DagRun *** @ 
2023-05-03 01:03:00+00:00: scheduled__2023-05-03T01:03:00+00:00, state:running, 
queued_at: 2023-05-04 01:03:04.048892+00:00. externally triggered: False> failed
   [2023-05-04 02:28:46,942] {dagrun.py:644} INFO - DagRun *** dag_id=*** 
execution_date=2023-05-03 01:03:00+00:00, 
run_id=scheduled__2023-05-03T01:03:00+00:00, run_start_date=2023-05-04 
01:03:04.421634+00:00, run_end_date=2023-05-04 06:28:46.942643+00:00, 
run_duration=19542.521009, state=failed, external_trigger=False, 
run_type=scheduled, data_interval_start=2023-05-03 01:03:00+00:00, 
data_interval_end=2023-05-04 01:03:00+00:00, 
dag_hash=5567f4afd9ffba286e0af33d0abbabd7
   [2023-05-04 02:28:46,985] {dag.py:3336} INFO - Setting next_dagrun for *** 
to 2023-05-04T01:03:00+00:00, run_after=2023-05-05T01:03:00+00:00
   [2023-05-04 02:28:47,298] {local_executor.py:130} ERROR - Failed to execute 
task PID of job runner does not match.
   Traceback (most recent call last):
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/executors/local_executor.py",
 line 126, in _execute_work_in_fork
       args.func(args)
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/cli_parser.py", 
line 52, in command
       return func(*args, **kwargs)
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/utils/cli.py", line 
103, in wrapper
       return f(*args, **kwargs)
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/commands/task_command.py",
 line 382, in task_run
       _run_task_by_selected_method(args, dag, ti)
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/commands/task_command.py",
 line 189, in _run_task_by_selected_method
       _run_task_by_local_task_job(args, ti)
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/commands/task_command.py",
 line 247, in _run_task_by_local_task_job
       run_job.run()
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/base_job.py", 
line 247, in run
       self._execute()
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/local_task_job.py",
 line 135, in _execute
       self.heartbeat()
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/base_job.py", 
line 228, in heartbeat
       self.heartbeat_callback(session=session)
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/utils/session.py", 
line 72, in wrapper
       return func(*args, **kwargs)
     File 
"/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/local_task_job.py",
 line 208, in heartbeat_callback
       raise AirflowException("PID of job runner does not match")
   airflow.exceptions.AirflowException: PID of job runner does not match
   ```
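
   For context, the check that fires here lives in `heartbeat_callback` in `airflow/jobs/local_task_job.py` (line 208 in the tracebacks above). A minimal sketch of what that guard does, with the PIDs taken from the task log; this is an illustration, not Airflow's exact source, and the helper name `check_recorded_pid` is hypothetical (the real check also handles `run_as_user` and process-group wrapping):

   ```python
   class AirflowException(Exception):
       """Stand-in for airflow.exceptions.AirflowException."""


   def check_recorded_pid(recorded_pid, current_pid):
       """Raise if the PID recorded on the TaskInstance in the metadata DB
       differs from the PID of the process the LocalTaskJob is supervising.

       Illustrative only: mirrors the guard in
       LocalTaskJob.heartbeat_callback, not a copy of it.
       """
       if recorded_pid is not None and recorded_pid != current_pid:
           # Corresponds to the warning in the task log:
           # "Recorded pid 3738156 does not match the current pid 3634136"
           raise AirflowException("PID of job runner does not match")


   # PIDs from the task log above: the recorded PID belongs to a different
   # process than the one heartbeating, so the job is killed with SIGTERM.
   try:
       check_recorded_pid(3738156, 3634136)
   except AirflowException as exc:
       print(exc)  # PID of job runner does not match
   ```

   In other words, the scheduler-side DagFileProcessorManager crash and relaunch seems to leave stale/duplicate job records, so a second fork ends up supervising a TaskInstance whose recorded PID points at the other process, and the heartbeat kills it.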

