Subhashini2610 commented on issue #29031:
URL: https://github.com/apache/airflow/issues/29031#issuecomment-1399407688

   I have now increased the time before the scheduler pod is restarted after 
a liveness probe failure. I have also rechecked the Postgres instance for failed 
connections, but the metrics suggest there aren't any. 
   
   When the scheduler heartbeat stopped, I ran the following commands on the 
scheduler pod:
   ```
   (airflow)airflow jobs check --job-type SchedulerJob --hostname "$(hostname)"
   
/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py:386: 
FutureWarning: The auth_backends setting in [api] has had 
airflow.api.auth.backend.session added in the running config, which is needed 
by the UI. Please update your config before Apache Airflow 3.0.
     FutureWarning,
   [2023-01-22T05:18:37.385+0000] {settings.py:266} DEBUG - Setting up DB 
connection pool (PID 1447)
   [2023-01-22T05:18:37.385+0000] {settings.py:377} DEBUG - 
settings.prepare_engine_args(): Using pool settings. pool_size=5, 
max_overflow=10, pool_recycle=1800, pid=1447
   [2023-01-22T05:18:37.422+0000] {cli_action_loggers.py:39} DEBUG - Adding 
<function default_action_log at 0x7f23bc60a170> to pre execution callback
   
/home/airflow/.local/lib/python3.7/site-packages/airflow/cli/cli_parser.py:905 
DeprecationWarning: The namespace option in [kubernetes] has been moved to the 
namespace option in [kubernetes_executor] - the old setting has been used, but 
please update your config.
   /usr/local/lib/python3.7/configparser.py:407 FutureWarning: The config 
section [kubernetes] has been renamed to [kubernetes_executor]. Please update 
your `conf.get*` call to use the new name
   No alive jobs found.
   [2023-01-22T05:18:38.512+0000] {settings.py:407} DEBUG - Disposing DB 
connection pool (PID 1447)
   ```
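   If it helps to automate catching this moment: a minimal sketch of a helper that 
classifies the output of `airflow jobs check`, keyed off the "No alive jobs found." 
message seen above. `classify_check` is a hypothetical name, and the guard means the 
real CLI call only runs where `airflow` is actually installed (e.g. inside the pod):
   ```shell
   # Hypothetical helper: classify `airflow jobs check` output.
   # "No alive jobs found." is printed when no SchedulerJob has a
   # recent heartbeat (as in the output above).
   classify_check() {
     if grep -q "No alive jobs found"; then
       echo DEAD
     else
       echo ALIVE
     fi
   }

   # Only run the real check when the airflow CLI is available.
   if command -v airflow >/dev/null 2>&1; then
     airflow jobs check --job-type SchedulerJob --hostname "$(hostname)" 2>&1 \
       | classify_check
   fi
   ```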
   I have also tried a DB check:
   ```
   (airflow)airflow db check
   
/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py:386: 
FutureWarning: The auth_backends setting in [api] has had 
airflow.api.auth.backend.session added in the running config, which is needed 
by the UI. Please update your config before Apache Airflow 3.0.
     FutureWarning,
   [2023-01-22T05:19:29.002+0000] {settings.py:266} DEBUG - Setting up DB 
connection pool (PID 1457)
   [2023-01-22T05:19:29.002+0000] {settings.py:377} DEBUG - 
settings.prepare_engine_args(): Using pool settings. pool_size=5, 
max_overflow=10, pool_recycle=1800, pid=1457
   [2023-01-22T05:19:29.039+0000] {cli_action_loggers.py:39} DEBUG - Adding 
<function default_action_log at 0x7fc0cdde2170> to pre execution callback
   
/home/airflow/.local/lib/python3.7/site-packages/airflow/cli/cli_parser.py:905 
DeprecationWarning: The namespace option in [kubernetes] has been moved to the 
namespace option in [kubernetes_executor] - the old setting has been used, but 
please update your config.
   /usr/local/lib/python3.7/configparser.py:407 FutureWarning: The config 
section [kubernetes] has been renamed to [kubernetes_executor]. Please update 
your `conf.get*` call to use the new name
   [2023-01-22T05:19:29.148+0000] {cli_action_loggers.py:65} DEBUG - Calling 
callbacks: [<function default_action_log at 0x7fc0cdde2170>]
   [2023-01-22T05:19:29.439+0000] {db.py:1720} INFO - Connection successful.
   [2023-01-22T05:19:29.440+0000] {cli_action_loggers.py:83} DEBUG - Calling 
callbacks: []
   [2023-01-22T05:19:29.440+0000] {settings.py:407} DEBUG - Disposing DB 
connection pool (PID 1457)
   ```
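   Taken together, the two checks separate the failure modes: `airflow db check` 
succeeding while `airflow jobs check` fails means the metadata DB is reachable but 
the SchedulerJob heartbeat row is stale, i.e. the scheduler loop itself is stuck 
rather than the DB connection. A hypothetical triage sketch combining the two exit 
codes (`triage` is an illustrative name, not an Airflow command):
   ```shell
   # Hypothetical triage helper combining the two checks above:
   # $1 = exit code of `airflow db check`
   # $2 = exit code of `airflow jobs check`
   triage() {
     if [ "$1" -ne 0 ]; then
       echo "DB unreachable: investigate Postgres / networking"
     elif [ "$2" -ne 0 ]; then
       echo "DB reachable but no alive SchedulerJob: heartbeat is stale"
     else
       echo "scheduler heartbeat healthy"
     fi
   }

   # Run the real checks only where the airflow CLI exists (e.g. in the pod).
   if command -v airflow >/dev/null 2>&1; then
     airflow db check >/dev/null 2>&1; db_rc=$?
     airflow jobs check --job-type SchedulerJob --hostname "$(hostname)" \
       >/dev/null 2>&1; jobs_rc=$?
     triage "$db_rc" "$jobs_rc"
   fi
   ```
   In my case it would land in the middle branch, which is why I suspect the 
scheduler process rather than the database.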
   
   Below are the scheduler logs from when the heartbeat stopped:
   ```
   [2023-01-22T04:54:55.378+0000] {base_job.py:229} DEBUG - [heartbeat]
   [2023-01-22T04:54:55.382+0000] {scheduler_job.py:898} DEBUG - Next timed 
event is in 3.344295
   [2023-01-22T04:54:55.383+0000] {scheduler_job.py:900} DEBUG - Ran scheduling 
loop in 0.06 seconds
   [2023-01-22T04:54:56.384+0000] {retries.py:84} DEBUG - Running 
SchedulerJob._create_dagruns_for_dags with retries. Try 1 of 3
   [2023-01-22T04:54:56.394+0000] {retries.py:84} DEBUG - Running 
SchedulerJob._get_next_dagruns_to_examine with retries. Try 1 of 3
   [2023-01-22T04:54:56.404+0000] {retries.py:84} DEBUG - Running 
SchedulerJob._get_next_dagruns_to_examine with retries. Try 1 of 3
   [2023-01-22T04:54:56.405+0000] {retries.py:84} DEBUG - Running 
SchedulerJob._schedule_all_dag_runs with retries. Try 1 of 3
   [2023-01-22T04:54:56.417+0000] {scheduler_job.py:355} DEBUG - No tasks to 
consider for execution.
   [2023-01-22T04:54:56.421+0000] {base_executor.py:163} DEBUG - 0 running task 
instances
   [2023-01-22T04:54:56.421+0000] {base_executor.py:164} DEBUG - 0 in queue
   [2023-01-22T04:54:56.421+0000] {base_executor.py:165} DEBUG - 32 open slots
   [2023-01-22T04:54:56.422+0000] {base_executor.py:174} DEBUG - Calling the 
<class 'airflow.executors.kubernetes_executor.KubernetesExecutor'> sync method
   [2023-01-22T04:54:56.422+0000] {kubernetes_executor.py:371} DEBUG - Syncing 
KubernetesExecutor
   [2023-01-22T04:54:56.422+0000] {kubernetes_executor.py:293} DEBUG - 
KubeJobWatcher alive, continuing
   [2023-01-22T04:54:56.422+0000] {kubernetes_executor.py:670} DEBUG - Next 
timed event is in 14.543564
   [2023-01-22T04:54:56.423+0000] {manager.py:275} DEBUG - Received message of 
type DagParsingStat
   [2023-01-22T04:54:56.423+0000] {manager.py:275} DEBUG - Received message of 
type DagParsingStat
   [2023-01-22T04:54:56.433+0000] {scheduler_job.py:898} DEBUG - Next timed 
event is in 2.293801
   [2023-01-22T04:54:56.433+0000] {scheduler_job.py:900} DEBUG - Ran scheduling 
loop in 0.05 seconds
   [2023-01-22T04:54:57.434+0000] {retries.py:84} DEBUG - Running 
SchedulerJob._create_dagruns_for_dags with retries. Try 1 of 3
   [2023-01-22T04:54:57.445+0000] {retries.py:84} DEBUG - Running 
SchedulerJob._get_next_dagruns_to_examine with retries. Try 1 of 3
   [2023-01-22T04:54:57.456+0000] {retries.py:84} DEBUG - Running 
SchedulerJob._get_next_dagruns_to_examine with retries. Try 1 of 3
   [2023-01-22T04:54:57.456+0000] {retries.py:84} DEBUG - Running 
SchedulerJob._schedule_all_dag_runs with retries. Try 1 of 3
   [2023-01-22T04:54:57.469+0000] {scheduler_job.py:355} DEBUG - No tasks to 
consider for execution.
   [2023-01-22T04:54:57.473+0000] {base_executor.py:163} DEBUG - 0 running task 
instances
   [2023-01-22T04:54:57.473+0000] {base_executor.py:164} DEBUG - 0 in queue
   [2023-01-22T04:54:57.473+0000] {base_executor.py:165} DEBUG - 32 open slots
   [2023-01-22T04:54:57.473+0000] {base_executor.py:174} DEBUG - Calling the 
<class 'airflow.executors.kubernetes_executor.KubernetesExecutor'> sync method
   [2023-01-22T04:54:57.473+0000] {kubernetes_executor.py:371} DEBUG - Syncing 
KubernetesExecutor
   [2023-01-22T04:54:57.473+0000] {kubernetes_executor.py:293} DEBUG - 
KubeJobWatcher alive, continuing
   [2023-01-22T04:54:57.474+0000] {kubernetes_executor.py:670} DEBUG - Next 
timed event is in 13.492197
   [2023-01-22T04:54:57.474+0000] {manager.py:275} DEBUG - Received message of 
type DagParsingStat
   [2023-01-22T04:54:57.474+0000] {manager.py:275} DEBUG - Received message of 
type DagParsingStat
   [2023-01-22T04:54:57.484+0000] {scheduler_job.py:898} DEBUG - Next timed 
event is in 1.242456
   [2023-01-22T04:54:57.484+0000] {scheduler_job.py:900} DEBUG - Ran scheduling 
loop in 0.05 seconds
   [2023-01-22T04:54:58.486+0000] {retries.py:84} DEBUG - Running 
SchedulerJob._create_dagruns_for_dags with retries. Try 1 of 3
   [2023-01-22T04:54:58.497+0000] {retries.py:84} DEBUG - Running 
SchedulerJob._get_next_dagruns_to_examine with retries. Try 1 of 3
   [2023-01-22T04:54:58.507+0000] {retries.py:84} DEBUG - Running 
SchedulerJob._get_next_dagruns_to_examine with retries. Try 1 of 3
   [2023-01-22T04:54:58.507+0000] {retries.py:84} DEBUG - Running 
SchedulerJob._schedule_all_dag_runs with retries. Try 1 of 3
   [2023-01-22T04:54:58.519+0000] {scheduler_job.py:355} DEBUG - No tasks to 
consider for execution.
   [2023-01-22T04:54:58.523+0000] {base_executor.py:163} DEBUG - 0 running task 
instances
   [2023-01-22T04:54:58.523+0000] {base_executor.py:164} DEBUG - 0 in queue
   [2023-01-22T04:54:58.524+0000] {base_executor.py:165} DEBUG - 32 open slots
   [2023-01-22T04:54:58.524+0000] {base_executor.py:174} DEBUG - Calling the 
<class 'airflow.executors.kubernetes_executor.KubernetesExecutor'> sync method
   [2023-01-22T04:54:58.524+0000] {kubernetes_executor.py:371} DEBUG - Syncing 
KubernetesExecutor
   [2023-01-22T04:54:58.524+0000] {kubernetes_executor.py:293} DEBUG - 
KubeJobWatcher alive, continuing
   [2023-01-22T04:54:58.525+0000] {kubernetes_executor.py:670} DEBUG - Next 
timed event is in 12.441311
   [2023-01-22T04:54:58.525+0000] {manager.py:275} DEBUG - Received message of 
type DagParsingStat
   [2023-01-22T04:54:58.525+0000] {manager.py:275} DEBUG - Received message of 
type DagParsingStat
   [2023-01-22T04:54:58.535+0000] {scheduler_job.py:898} DEBUG - Next timed 
event is in 0.191392
   [2023-01-22T04:54:58.536+0000] {scheduler_job.py:900} DEBUG - Ran scheduling 
loop in 0.05 seconds
   [2023-01-22T04:54:58.728+0000] {retries.py:84} DEBUG - Running 
SchedulerJob._create_dagruns_for_dags with retries. Try 1 of 3
   [2023-01-22T04:54:58.737+0000] {retries.py:84} DEBUG - Running 
SchedulerJob._get_next_dagruns_to_examine with retries. Try 1 of 3
   [2023-01-22T04:54:58.747+0000] {retries.py:84} DEBUG - Running 
SchedulerJob._get_next_dagruns_to_examine with retries. Try 1 of 3
   [2023-01-22T04:54:58.748+0000] {retries.py:84} DEBUG - Running 
SchedulerJob._schedule_all_dag_runs with retries. Try 1 of 3
   [2023-01-22T04:54:58.762+0000] {scheduler_job.py:355} DEBUG - No tasks to 
consider for execution.
   [2023-01-22T04:54:58.766+0000] {base_executor.py:163} DEBUG - 0 running task 
instances
   [2023-01-22T04:54:58.766+0000] {base_executor.py:164} DEBUG - 0 in queue
   [2023-01-22T04:54:58.766+0000] {base_executor.py:165} DEBUG - 32 open slots
   [2023-01-22T04:54:58.766+0000] {base_executor.py:174} DEBUG - Calling the 
<class 'airflow.executors.kubernetes_executor.KubernetesExecutor'> sync method
   [2023-01-22T04:54:58.766+0000] {kubernetes_executor.py:371} DEBUG - Syncing 
KubernetesExecutor
   [2023-01-22T04:54:58.766+0000] {kubernetes_executor.py:293} DEBUG - 
KubeJobWatcher alive, continuing
   [2023-01-22T04:54:58.767+0000] {kubernetes_executor.py:670} DEBUG - Next 
timed event is in 12.199200
   [2023-01-22T04:54:58.777+0000] {scheduler_job.py:1513} DEBUG - Finding 
'running' jobs without a recent heartbeat
   [2023-01-22T04:54:58.785+0000] {scheduler_job.py:898} DEBUG - Next timed 
event is in 1.341696
   [2023-01-22T04:54:58.785+0000] {scheduler_job.py:900} DEBUG - Ran scheduling 
loop in 0.06 seconds
   [2023-01-22T04:54:59.786+0000] {retries.py:84} DEBUG - Running 
SchedulerJob._create_dagruns_for_dags with retries. Try 1 of 3
   [2023-01-22T04:54:59.796+0000] {retries.py:84} DEBUG - Running 
SchedulerJob._get_next_dagruns_to_examine with retries. Try 1 of 3
   [2023-01-22T04:54:59.806+0000] {retries.py:84} DEBUG - Running 
SchedulerJob._get_next_dagruns_to_examine with retries. Try 1 of 3
   [2023-01-22T04:54:59.807+0000] {retries.py:84} DEBUG - Running 
SchedulerJob._schedule_all_dag_runs with retries. Try 1 of 3
   [2023-01-22T04:54:59.819+0000] {scheduler_job.py:355} DEBUG - No tasks to 
consider for execution.
   [2023-01-22T04:54:59.823+0000] {base_executor.py:163} DEBUG - 0 running task 
instances
   [2023-01-22T04:54:59.823+0000] {base_executor.py:164} DEBUG - 0 in queue
   [2023-01-22T04:54:59.823+0000] {base_executor.py:165} DEBUG - 32 open slots
   [2023-01-22T04:54:59.824+0000] {base_executor.py:174} DEBUG - Calling the 
<class 'airflow.executors.kubernetes_executor.KubernetesExecutor'> sync method
   [2023-01-22T04:54:59.824+0000] {kubernetes_executor.py:371} DEBUG - Syncing 
KubernetesExecutor
   [2023-01-22T04:54:59.824+0000] {kubernetes_executor.py:293} DEBUG - 
KubeJobWatcher alive, continuing
   [2023-01-22T04:54:59.824+0000] {kubernetes_executor.py:670} DEBUG - Next 
timed event is in 11.141761
   [2023-01-22T04:54:59.824+0000] {manager.py:275} DEBUG - Received message of 
type DagParsingStat
   [2023-01-22T04:54:59.825+0000] {manager.py:275} DEBUG - Received message of 
type DagParsingStat
   [2023-01-22T04:54:59.835+0000] {scheduler_job.py:898} DEBUG - Next timed 
event is in 0.291539
   [2023-01-22T04:54:59.835+0000] {scheduler_job.py:900} DEBUG - Ran scheduling 
loop in 0.05 seconds
   [2023-01-22T04:55:00.127+0000] {retries.py:84} DEBUG - Running 
SchedulerJob._create_dagruns_for_dags with retries. Try 1 of 3
   [2023-01-22T04:55:00.149+0000] {dag.py:3427} INFO - Setting next_dagrun for 
hello_world to 2023-01-22T04:55:00+00:00, run_after=2023-01-22T05:00:00+00:00
   [2023-01-22T04:55:00.154+0000] {retries.py:84} DEBUG - Running 
SchedulerJob._get_next_dagruns_to_examine with retries. Try 1 of 3
   [2023-01-22T04:55:00.169+0000] {retries.py:84} DEBUG - Running 
SchedulerJob._get_next_dagruns_to_examine with retries. Try 1 of 3
   [2023-01-22T04:55:00.170+0000] {retries.py:84} DEBUG - Running 
SchedulerJob._schedule_all_dag_runs with retries. Try 1 of 3
   [2023-01-22T04:55:00.177+0000] {scheduler_job.py:1338} DEBUG - DAG 
hello_world not changed structure, skipping dagrun.verify_integrity
   [2023-01-22T04:55:00.180+0000] {dagrun.py:687} DEBUG - number of tis tasks 
for <DagRun hello_world @ 2023-01-22 04:50:00+00:00: 
scheduled__2023-01-22T04:50:00+00:00, state:running, queued_at: 2023-01-22 
04:55:00.138390+00:00. externally triggered: False>: 1 task(s)
   [2023-01-22T04:55:00.180+0000] {dagrun.py:708} DEBUG - number of 
scheduleable tasks for <DagRun hello_world @ 2023-01-22 04:50:00+00:00: 
scheduled__2023-01-22T04:50:00+00:00, state:running, queued_at: 2023-01-22 
04:55:00.138390+00:00. externally triggered: False>: 1 task(s)
   [2023-01-22T04:55:00.180+0000] {taskinstance.py:1102} DEBUG - <TaskInstance: 
hello_world.hello_task scheduled__2023-01-22T04:50:00+00:00 [None]> dependency 
'Trigger Rule' PASSED: True, The task instance did not have any upstream tasks.
   [2023-01-22T04:55:00.180+0000] {taskinstance.py:1102} DEBUG - <TaskInstance: 
hello_world.hello_task scheduled__2023-01-22T04:50:00+00:00 [None]> dependency 
'Previous Dagrun State' PASSED: True, The task did not have depends_on_past set.
   [2023-01-22T04:55:00.180+0000] {taskinstance.py:1102} DEBUG - <TaskInstance: 
hello_world.hello_task scheduled__2023-01-22T04:50:00+00:00 [None]> dependency 
'Not In Retry Period' PASSED: True, The task instance was not marked for 
retrying.
   [2023-01-22T04:55:00.181+0000] {taskinstance.py:1087} DEBUG - Dependencies 
all met for <TaskInstance: hello_world.hello_task 
scheduled__2023-01-22T04:50:00+00:00 [None]>
   [2023-01-22T04:55:00.189+0000] {scheduler_job.py:1362} DEBUG - Skipping SLA 
check for <DAG: hello_world> because no tasks in DAG have SLAs
   [2023-01-22T04:55:00.189+0000] {scheduler_job.py:1354} DEBUG - callback is 
empty
   [2023-01-22T04:55:00.198+0000] {scheduler_job.py:361} INFO - 1 tasks up for 
execution:
        <TaskInstance: hello_world.hello_task 
scheduled__2023-01-22T04:50:00+00:00 [scheduled]>
   [2023-01-22T04:55:00.199+0000] {scheduler_job.py:429} INFO - DAG hello_world 
has 0/16 running and queued tasks
   [2023-01-22T04:55:00.199+0000] {scheduler_job.py:511} INFO - Setting the 
following tasks to queued state:
        <TaskInstance: hello_world.hello_task 
scheduled__2023-01-22T04:50:00+00:00 [scheduled]>
   [2023-01-22T04:55:00.201+0000] {scheduler_job.py:550} INFO - Sending 
TaskInstanceKey(dag_id='hello_world', task_id='hello_task', 
run_id='scheduled__2023-01-22T04:50:00+00:00', try_number=1, map_index=-1) to 
executor with priority 1 and queue default
   [2023-01-22T04:55:00.201+0000] {base_executor.py:95} INFO - Adding to queue: 
['airflow', 'tasks', 'run', 'hello_world', 'hello_task', 
'scheduled__2023-01-22T04:50:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/hello_world.py']
   [2023-01-22T04:55:00.204+0000] {base_executor.py:163} DEBUG - 0 running task 
instances
   [2023-01-22T04:55:00.204+0000] {base_executor.py:164} DEBUG - 1 in queue
   [2023-01-22T04:55:00.204+0000] {base_executor.py:165} DEBUG - 32 open slots
   [2023-01-22T04:55:00.204+0000] {kubernetes_executor.py:559} DEBUG - Add task 
TaskInstanceKey(dag_id='hello_world', task_id='hello_task', 
run_id='scheduled__2023-01-22T04:50:00+00:00', try_number=1, map_index=-1) with 
command ['airflow', 'tasks', 'run', 'hello_world', 'hello_task', 
'scheduled__2023-01-22T04:50:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/hello_world.py'], executor_config {}
   [2023-01-22T04:55:00.205+0000] {base_executor.py:174} DEBUG - Calling the 
<class 'airflow.executors.kubernetes_executor.KubernetesExecutor'> sync method
   [2023-01-22T04:55:00.205+0000] {kubernetes_executor.py:585} DEBUG - 
self.running: {TaskInstanceKey(dag_id='hello_world', task_id='hello_task', 
run_id='scheduled__2023-01-22T04:50:00+00:00', try_number=1, map_index=-1)}
   [2023-01-22T04:55:00.205+0000] {kubernetes_executor.py:371} DEBUG - Syncing 
KubernetesExecutor
   [2023-01-22T04:55:00.205+0000] {kubernetes_executor.py:293} DEBUG - 
KubeJobWatcher alive, continuing
   [2023-01-22T04:55:00.250+0000] {kubernetes_executor.py:339} INFO - Creating 
kubernetes pod for job is TaskInstanceKey(dag_id='hello_world', 
task_id='hello_task', run_id='scheduled__2023-01-22T04:50:00+00:00', 
try_number=1, map_index=-1), with pod name 
hello-world-hello-task-b9c494d4d03640e6b7e158fee576aff4
   [2023-01-22T04:55:00.250+0000] {kubernetes_executor.py:340} DEBUG - 
Kubernetes running for command ['airflow', 'tasks', 'run', 'hello_world', 
'hello_task', 'scheduled__2023-01-22T04:50:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/hello_world.py']
   [2023-01-22T04:55:00.250+0000] {kubernetes_executor.py:341} DEBUG - 
Kubernetes launching image ecedevweuaftst1acr.azurecr.io/airflow:0.0.1
   [2023-01-22T04:55:00.251+0000] {kubernetes_executor.py:267} DEBUG - Pod 
Creation Request: 
   {
     "apiVersion": "v1",
     "kind": "Pod",
     "metadata": {
       "annotations": {
         "dag_id": "hello_world",
         "task_id": "hello_task",
         "try_number": "1",
         "run_id": "scheduled__2023-01-22T04:50:00+00:00"
       },
       "labels": {
         "tier": "airflow",
         "component": "worker",
         "release": "airflow",
         "airflow-worker": "302",
         "dag_id": "hello_world",
         "task_id": "hello_task",
         "try_number": "1",
         "airflow_version": "2.5.0",
         "kubernetes_executor": "True",
         "run_id": "scheduled__2023-01-22T0450000000-5c9c499a2"
       },
       "name": "hello-world-hello-task-b9c494d4d03640e6b7e158fee576aff4",
       "namespace": "airflow"
     },
     "spec": {
       "affinity": {},
       "containers": [
         {
           "args": [
             "airflow",
             "tasks",
             "run",
             "hello_world",
             "hello_task",
             "scheduled__2023-01-22T04:50:00+00:00",
             "--local",
             "--subdir",
             "DAGS_FOLDER/hello_world.py"
           ],
           "env": [
             {
               "name": "AIRFLOW__CORE__EXECUTOR",
               "value": "LocalExecutor"
             },
             {
               "name": "AIRFLOW__CORE__FERNET_KEY",
               "valueFrom": {
                 "secretKeyRef": {
                   "key": "fernet-key",
                   "name": "afl-fernet-secret"
                 }
               }
             },
             {
               "name": "AIRFLOW__CORE__SQL_ALCHEMY_CONN",
               "valueFrom": {
                 "secretKeyRef": {
                   "key": "connection",
                   "name": "airflow-airflow-metadata"
                 }
               }
             },
             {
               "name": "AIRFLOW__DATABASE__SQL_ALCHEMY_CONN",
               "valueFrom": {
                 "secretKeyRef": {
                   "key": "connection",
                   "name": "airflow-airflow-metadata"
                 }
               }
             },
             {
               "name": "AIRFLOW_CONN_AIRFLOW_DB",
               "valueFrom": {
                 "secretKeyRef": {
                   "key": "connection",
                   "name": "airflow-airflow-metadata"
                 }
               }
             },
             {
               "name": "AIRFLOW__WEBSERVER__SECRET_KEY",
               "valueFrom": {
                 "secretKeyRef": {
                   "key": "webserver-secret-key",
                   "name": "afl-webserver-secret"
                 }
               }
             },
             {
               "name": "AZURE_TENANT_ID",
               "value": "xxxxxxxx"
             },
             {
               "name": 
"AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__AZURE_TENANT_ID",
               "value": "xxxxxxxx"
             },
             {
               "name": "AZURE_CLIENT_ID",
               "value": "xxxxxxxx"
             },
             {
               "name": 
"AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__AZURE_CLIENT_ID",
               "value": "xxxxxxxx"
             },
             {
               "name": "AZURE_CLIENT_SECRET",
               "value": "xxxxxxxx"
             },
             {
               "name": 
"AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__AZURE_CLIENT_SECRET",
               "value": "xxxxxxxx"
             },
             {
               "name": "AIRFLOW__SECRETS__BACKEND",
               "value": 
"airflow.providers.microsoft.azure.secrets.key_vault.AzureKeyVaultBackend"
             },
             {
               "name": 
"AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__AIRFLOW__SECRETS__BACKEND",
               "value": 
"airflow.providers.microsoft.azure.secrets.key_vault.AzureKeyVaultBackend"
             },
             {
               "name": "AIRFLOW__SECRETS__BACKEND_KWARGS",
               "value": "{\"connections_prefix\": \"\", \"variables_prefix\": 
\"af\", \"vault_url\": \"https://something-kv.vault.azure.net\", \"sep\": 
\"-\"}"
             },
             {
               "name": 
"AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__AIRFLOW__SECRETS__BACKEND_KWARGS",
               "value": "{\"connections_prefix\": \"\", \"variables_prefix\": 
\"af\", \"vault_url\": \"https://something-kv.vault.azure.net\", \"sep\": 
\"-\"}"
             },
             {
               "name": "client_id",
               "valueFrom": {
                 "secretKeyRef": {
                   "key": "client_id",
                   "name": "oauth"
                 }
               }
             },
             {
               "name": "client_secret",
               "valueFrom": {
                 "secretKeyRef": {
                   "key": "client_secret",
                   "name": "oauth"
                 }
               }
             },
             {
               "name": "AIRFLOW__KUBERNETES_SECRETS__client_id",
               "value": "oauth=client_id"
             },
             {
               "name": "AIRFLOW__KUBERNETES_SECRETS__client_secret",
               "value": "oauth=client_secret"
             },
             {
               "name": "AIRFLOW_IS_K8S_EXECUTOR_POD",
               "value": "True"
             }
           ],
           "image": "ecedevweuaftst1acr.azurecr.io/airflow:0.0.1",
           "imagePullPolicy": "IfNotPresent",
           "name": "base",
           "resources": {},
           "volumeMounts": [
             {
               "mountPath": "/opt/airflow/logs",
               "name": "logs"
             },
             {
               "mountPath": "/opt/airflow/airflow.cfg",
               "name": "config",
               "readOnly": true,
               "subPath": "airflow.cfg"
             },
             {
               "mountPath": "/opt/airflow/config/airflow_local_settings.py",
               "name": "config",
               "readOnly": true,
               "subPath": "airflow_local_settings.py"
             },
             {
               "mountPath": "/opt/airflow/dags",
               "name": "dags",
               "readOnly": false
             }
           ]
         }
       ],
       "imagePullSecrets": [
         {
           "name": "airflow-registry"
         }
       ],
       "restartPolicy": "Never",
       "securityContext": {
         "fsGroup": 0,
         "runAsUser": 50000
       },
       "serviceAccountName": "airflow-worker",
       "volumes": [
         {
           "name": "dags",
           "persistentVolumeClaim": {
             "claimName": "dag-pvc"
           }
         },
         {
           "name": "logs",
           "persistentVolumeClaim": {
             "claimName": "log-pvc"
           }
         },
         {
           "configMap": {
             "name": "airflow-airflow-config"
           },
           "name": "config"
         }
       ]
     }
   }
   [2023-01-22T04:55:14.432+0000] {settings.py:407} DEBUG - Disposing DB 
connection pool (PID 1185)
   [2023-01-22T04:55:14.554+0000] {settings.py:407} DEBUG - Disposing DB 
connection pool (PID 1189)
   [2023-01-22T04:55:14.777+0000] {processor.py:296} DEBUG - Waiting for 
<ForkProcess(DagFileProcessor162-Process, stopped)>
   [2023-01-22T04:55:15.013+0000] {processor.py:296} DEBUG - Waiting for 
<ForkProcess(DagFileProcessor163-Process, stopped)>
   [2023-01-22T04:55:15.159+0000] {settings.py:407} DEBUG - Disposing DB 
connection pool (PID 1199)
   [2023-01-22T04:55:15.546+0000] {processor.py:296} DEBUG - Waiting for 
<ForkProcess(DagFileProcessor164-Process, stopped)>
   [2023-01-22T04:55:45.717+0000] {settings.py:407} DEBUG - Disposing DB 
connection pool (PID 1206)
   [2023-01-22T04:55:45.770+0000] {settings.py:407} DEBUG - Disposing DB 
connection pool (PID 1210)
   [2023-01-22T04:55:46.046+0000] {processor.py:296} DEBUG - Waiting for 
<ForkProcess(DagFileProcessor165-Process, stopped)>
   [2023-01-22T04:55:46.389+0000] {processor.py:296} DEBUG - Waiting for 
<ForkProcess(DagFileProcessor166-Process, stopped)>
   [2023-01-22T04:55:46.423+0000] {settings.py:407} DEBUG - Disposing DB 
connection pool (PID 1220)
   [2023-01-22T04:55:46.805+0000] {processor.py:296} DEBUG - Waiting for 
<ForkProcess(DagFileProcessor167-Process, stopped)>
   [2023-01-22T04:56:17.079+0000] {settings.py:407} DEBUG - Disposing DB 
connection pool (PID 1231)
   [2023-01-22T04:56:17.095+0000] {settings.py:407} DEBUG - Disposing DB 
connection pool (PID 1227)
   [2023-01-22T04:56:17.493+0000] {processor.py:296} DEBUG - Waiting for 
<ForkProcess(DagFileProcessor169-Process, stopped)>
   [2023-01-22T04:56:17.747+0000] {processor.py:296} DEBUG - Waiting for 
<ForkProcess(DagFileProcessor168-Process, stopped)>
   [2023-01-22T04:56:17.901+0000] {settings.py:407} DEBUG - Disposing DB 
connection pool (PID 1241)
   [2023-01-22T04:56:18.215+0000] {processor.py:296} DEBUG - Waiting for 
<ForkProcess(DagFileProcessor170-Process, stopped)>
   [2023-01-22T04:56:48.375+0000] {settings.py:407} DEBUG - Disposing DB 
connection pool (PID 1248)
   [2023-01-22T04:56:48.450+0000] {settings.py:407} DEBUG - Disposing DB 
connection pool (PID 1252)
   [2023-01-22T04:56:48.727+0000] {processor.py:296} DEBUG - Waiting for 
<ForkProcess(DagFileProcessor171-Process, stopped)>
   [2023-01-22T04:56:48.955+0000] {processor.py:296} DEBUG - Waiting for 
<ForkProcess(DagFileProcessor172-Process, stopped)>
   [2023-01-22T04:56:49.107+0000] {settings.py:407} DEBUG - Disposing DB 
connection pool (PID 1262)
   [2023-01-22T04:56:49.487+0000] {processor.py:296} DEBUG - Waiting for 
<ForkProcess(DagFileProcessor173-Process, stopped)>
   [2023-01-22T04:57:11.692+0000] {serialized_dag.py:245} DEBUG - Deleting 
Serialized DAGs (for which DAG files are deleted) from serialized_dag table 
   [2023-01-22T04:57:11.698+0000] {dag.py:3317} DEBUG - Deactivating DAGs (for 
which DAG files are deleted) from dag table 
   [2023-01-22T04:57:11.707+0000] {dagcode.py:136} DEBUG - Deleting code from 
dag_code table 
   [2023-01-22T04:57:19.075+0000] {settings.py:407} DEBUG - Disposing DB 
connection pool (PID 1269)
   [2023-01-22T04:57:19.493+0000] {processor.py:296} DEBUG - Waiting for 
<ForkProcess(DagFileProcessor174-Process, stopped)>
   [2023-01-22T04:57:19.827+0000] {settings.py:407} DEBUG - Disposing DB 
connection pool (PID 1276)
   [2023-01-22T04:57:19.885+0000] {settings.py:407} DEBUG - Disposing DB 
connection pool (PID 1280)
   [2023-01-22T04:57:20.150+0000] {processor.py:296} DEBUG - Waiting for 
<ForkProcess(DagFileProcessor175-Process, stopped)>
   [2023-01-22T04:57:20.407+0000] {processor.py:296} DEBUG - Waiting for 
<ForkProcess(DagFileProcessor176-Process, stopped)>
   [2023-01-22T04:57:50.106+0000] {settings.py:407} DEBUG - Disposing DB 
connection pool (PID 1290)
   [2023-01-22T04:57:50.423+0000] {processor.py:296} DEBUG - Waiting for 
<ForkProcess(DagFileProcessor177-Process, stopped)>
   [2023-01-22T04:58:22.358+0000] {settings.py:407} DEBUG - Disposing DB 
connection pool (PID 1301)
   [2023-01-22T04:58:22.363+0000] {settings.py:407} DEBUG - Disposing DB 
connection pool (PID 1297)
   [2023-01-22T04:58:22.714+0000] {processor.py:296} DEBUG - Waiting for 
<ForkProcess(DagFileProcessor179-Process, stopped)>
   [2023-01-22T04:58:22.951+0000] {processor.py:296} DEBUG - Waiting for 
<ForkProcess(DagFileProcessor178-Process, stopped)>
   [2023-01-22T04:58:23.122+0000] {settings.py:407} DEBUG - Disposing DB 
connection pool (PID 1311)
   [2023-01-22T04:58:23.469+0000] {processor.py:296} DEBUG - Waiting for 
<ForkProcess(DagFileProcessor180-Process, stopped)>
   [2023-01-22T04:58:53.648+0000] {settings.py:407} DEBUG - Disposing DB 
connection pool (PID 1318)
   [2023-01-22T04:58:53.665+0000] {settings.py:407} DEBUG - Disposing DB 
connection pool (PID 1322)
   [2023-01-22T04:58:54.013+0000] {processor.py:296} DEBUG - Waiting for 
<ForkProcess(DagFileProcessor181-Process, stopped)>
   [2023-01-22T04:58:54.254+0000] {processor.py:296} DEBUG - Waiting for <ForkProcess(DagFileProcessor182-Process, stopped)>
   [2023-01-22T04:58:54.487+0000] {settings.py:407} DEBUG - Disposing DB connection pool (PID 1332)
   [2023-01-22T04:58:54.804+0000] {processor.py:296} DEBUG - Waiting for <ForkProcess(DagFileProcessor183-Process, stopped)>
   [2023-01-22T04:59:24.862+0000] {settings.py:407} DEBUG - Disposing DB connection pool (PID 1363)
   [2023-01-22T04:59:24.938+0000] {settings.py:407} DEBUG - Disposing DB connection pool (PID 1359)
   [2023-01-22T04:59:25.190+0000] {processor.py:296} DEBUG - Waiting for <ForkProcess(DagFileProcessor185-Process, stopped)>
   [2023-01-22T04:59:25.423+0000] {processor.py:296} DEBUG - Waiting for <ForkProcess(DagFileProcessor184-Process, stopped)>
   [2023-01-22T04:59:25.657+0000] {settings.py:407} DEBUG - Disposing DB connection pool (PID 1373)
   [2023-01-22T04:59:25.994+0000] {processor.py:296} DEBUG - Waiting for <ForkProcess(DagFileProcessor186-Process, stopped)>
   [2023-01-22T04:59:55.647+0000] {settings.py:407} DEBUG - Disposing DB connection pool (PID 1391)
   [2023-01-22T04:59:56.046+0000] {settings.py:407} DEBUG - Disposing DB connection pool (PID 1395)
   [2023-01-22T04:59:56.050+0000] {processor.py:296} DEBUG - Waiting for <ForkProcess(DagFileProcessor187-Process, stopped)>
   [2023-01-22T04:59:56.385+0000] {processor.py:296} DEBUG - Waiting for <ForkProcess(DagFileProcessor188-Process, stopped)>
   [2023-01-22T04:59:56.474+0000] {settings.py:407} DEBUG - Disposing DB connection pool (PID 1405)
   [2023-01-22T04:59:56.892+0000] {processor.py:296} DEBUG - Waiting for <ForkProcess(DagFileProcessor189-Process, stopped)>
   [2023-01-22T05:00:26.559+0000] {settings.py:407} DEBUG - Disposing DB connection pool (PID 1412)
   [2023-01-22T05:00:26.945+0000] {processor.py:296} DEBUG - Waiting for <ForkProcess(DagFileProcessor190-Process, stopped)>
   [2023-01-22T05:00:27.014+0000] {settings.py:407} DEBUG - Disposing DB connection pool (PID 1419)
   [2023-01-22T05:00:27.215+0000] {settings.py:407} DEBUG - Disposing DB connection pool (PID 1423)
   [2023-01-22T05:00:27.341+0000] {processor.py:296} DEBUG - Waiting for <ForkProcess(DagFileProcessor191-Process, stopped)>
   [2023-01-22T05:00:27.623+0000] {processor.py:296} DEBUG - Waiting for <ForkProcess(DagFileProcessor192-Process, stopped)>
   [2023-01-22T05:00:33.734+0000] {kubernetes_executor.py:274} ERROR - Exception when attempting to create Namespaced Pod: {
     "apiVersion": "v1",
     "kind": "Pod",
     "metadata": {
       "annotations": {
         "dag_id": "hello_world",
         "task_id": "hello_task",
         "try_number": "1",
         "run_id": "scheduled__2023-01-22T04:50:00+00:00"
       },
       "labels": {
         "tier": "airflow",
         "component": "worker",
         "release": "airflow",
         "airflow-worker": "302",
         "dag_id": "hello_world",
         "task_id": "hello_task",
         "try_number": "1",
         "airflow_version": "2.5.0",
         "kubernetes_executor": "True",
         "run_id": "scheduled__2023-01-22T0450000000-5c9c499a2"
       },
       "name": "hello-world-hello-task-b9c494d4d03640e6b7e158fee576aff4",
       "namespace": "airflow"
     },
     "spec": {
       "affinity": {},
       "containers": [
         {
           "args": [
             "airflow",
             "tasks",
             "run",
             "hello_world",
             "hello_task",
             "scheduled__2023-01-22T04:50:00+00:00",
             "--local",
             "--subdir",
             "DAGS_FOLDER/hello_world.py"
           ],
           "env": [
             {
               "name": "AIRFLOW__CORE__EXECUTOR",
               "value": "LocalExecutor"
             },
             {
               "name": "AIRFLOW__CORE__FERNET_KEY",
               "valueFrom": {
                 "secretKeyRef": {
                   "key": "fernet-key",
                   "name": "afl-fernet-secret"
                 }
               }
             },
             {
               "name": "AIRFLOW__CORE__SQL_ALCHEMY_CONN",
               "valueFrom": {
                 "secretKeyRef": {
                   "key": "connection",
                   "name": "airflow-airflow-metadata"
                 }
               }
             },
             {
               "name": "AIRFLOW__DATABASE__SQL_ALCHEMY_CONN",
               "valueFrom": {
                 "secretKeyRef": {
                   "key": "connection",
                   "name": "airflow-airflow-metadata"
                 }
               }
             },
             {
               "name": "AIRFLOW_CONN_AIRFLOW_DB",
               "valueFrom": {
                 "secretKeyRef": {
                   "key": "connection",
                   "name": "airflow-airflow-metadata"
                 }
               }
             },
             {
               "name": "AIRFLOW__WEBSERVER__SECRET_KEY",
               "valueFrom": {
                 "secretKeyRef": {
                   "key": "webserver-secret-key",
                   "name": "afl-webserver-secret"
                 }
               }
             },
             {
               "name": "AZURE_TENANT_ID",
               "value": "xxxxxxxx"
             },
             {
                "name": "AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__AZURE_TENANT_ID",
               "value": "xxxxxxxx"
             },
             {
               "name": "AZURE_CLIENT_ID",
               "value": "xxxxxxxx"
             },
             {
                "name": "AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__AZURE_CLIENT_ID",
               "value": "xxxxxxxx"
             },
             {
               "name": "AZURE_CLIENT_SECRET",
               "value": "xxxxxxxx"
             },
             {
                "name": "AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__AZURE_CLIENT_SECRET",
               "value": "xxxxxxxx"
             },
             {
               "name": "AIRFLOW__SECRETS__BACKEND",
                "value": "airflow.providers.microsoft.azure.secrets.key_vault.AzureKeyVaultBackend"
             },
             {
                "name": "AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__AIRFLOW__SECRETS__BACKEND",
                "value": "airflow.providers.microsoft.azure.secrets.key_vault.AzureKeyVaultBackend"
             },
             {
               "name": "AIRFLOW__SECRETS__BACKEND_KWARGS",
                "value": "{\"connections_prefix\": \"\", \"variables_prefix\": \"af\", \"vault_url\": \"something-kv.vault.azure.net\", \"sep\": \"-\"}"
             },
             {
                "name": "AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__AIRFLOW__SECRETS__BACKEND_KWARGS",
                "value": "{\"connections_prefix\": \"\", \"variables_prefix\": \"af\", \"vault_url\": \"https://something-kv.vault.azure.net\", \"sep\": \"-\"}"
             },
             {
               "name": "client_id",
               "valueFrom": {
                 "secretKeyRef": {
                   "key": "client_id",
                   "name": "oauth"
                 }
               }
             },
             {
               "name": "client_secret",
               "valueFrom": {
                 "secretKeyRef": {
                   "key": "client_secret",
                   "name": "oauth"
                 }
               }
             },
             {
               "name": "AIRFLOW__KUBERNETES_SECRETS__client_id",
               "value": "oauth=client_id"
             },
             {
               "name": "AIRFLOW__KUBERNETES_SECRETS__client_secret",
               "value": "oauth=client_secret"
             },
             {
               "name": "AIRFLOW_IS_K8S_EXECUTOR_POD",
               "value": "True"
             }
           ],
           "image": "ecedevweuaftst1acr.azurecr.io/airflow:0.0.1",
           "imagePullPolicy": "IfNotPresent",
           "name": "base",
           "resources": {},
           "volumeMounts": [
             {
               "mountPath": "/opt/airflow/logs",
               "name": "logs"
             },
             {
               "mountPath": "/opt/airflow/airflow.cfg",
               "name": "config",
               "readOnly": true,
               "subPath": "airflow.cfg"
             },
             {
               "mountPath": "/opt/airflow/config/airflow_local_settings.py",
               "name": "config",
               "readOnly": true,
               "subPath": "airflow_local_settings.py"
             },
             {
               "mountPath": "/opt/airflow/dags",
               "name": "dags",
               "readOnly": false
             }
           ]
         }
       ],
       "imagePullSecrets": [
         {
           "name": "airflow-registry"
         }
       ],
       "restartPolicy": "Never",
       "securityContext": {
         "fsGroup": 0,
         "runAsUser": 50000
       },
       "serviceAccountName": "airflow-worker",
       "volumes": [
         {
           "name": "dags",
           "persistentVolumeClaim": {
             "claimName": "dag-pvc"
           }
         },
         {
           "name": "logs",
           "persistentVolumeClaim": {
             "claimName": "log-pvc"
           }
         },
         {
           "configMap": {
             "name": "airflow-airflow-config"
           },
           "name": "config"
         }
       ]
     }
   }
    Traceback (most recent call last):
      File "/home/airflow/.local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 710, in urlopen
        chunked=chunked,
      File "/home/airflow/.local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 449, in _make_request
        six.raise_from(e, None)
      File "<string>", line 3, in raise_from
      File "/home/airflow/.local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 444, in _make_request
        httplib_response = conn.getresponse()
      File "/usr/local/lib/python3.7/http/client.py", line 1373, in getresponse
        response.begin()
      File "/usr/local/lib/python3.7/http/client.py", line 319, in begin
        version, status, reason = self._read_status()
      File "/usr/local/lib/python3.7/http/client.py", line 288, in _read_status
        raise RemoteDisconnected("Remote end closed connection without"
    http.client.RemoteDisconnected: Remote end closed connection without response
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "/home/airflow/.local/lib/python3.7/site-packages/airflow/executors/kubernetes_executor.py", line 270, in run_pod_async
        body=sanitized_pod, namespace=pod.metadata.namespace, **kwargs
      File "/home/airflow/.local/lib/python3.7/site-packages/kubernetes/client/api/core_v1_api.py", line 7356, in create_namespaced_pod
        return self.create_namespaced_pod_with_http_info(namespace, body, **kwargs)  # noqa: E501
      File "/home/airflow/.local/lib/python3.7/site-packages/kubernetes/client/api/core_v1_api.py", line 7469, in create_namespaced_pod_with_http_info
        collection_formats=collection_formats)
      File "/home/airflow/.local/lib/python3.7/site-packages/kubernetes/client/api_client.py", line 353, in call_api
        _preload_content, _request_timeout, _host)
      File "/home/airflow/.local/lib/python3.7/site-packages/kubernetes/client/api_client.py", line 184, in __call_api
        _request_timeout=_request_timeout)
      File "/home/airflow/.local/lib/python3.7/site-packages/kubernetes/client/api_client.py", line 397, in request
        body=body)
      File "/home/airflow/.local/lib/python3.7/site-packages/kubernetes/client/rest.py", line 281, in POST
        body=body)
      File "/home/airflow/.local/lib/python3.7/site-packages/kubernetes/client/rest.py", line 173, in request
        headers=headers)
      File "/home/airflow/.local/lib/python3.7/site-packages/urllib3/request.py", line 79, in request
        method, url, fields=fields, headers=headers, **urlopen_kw
      File "/home/airflow/.local/lib/python3.7/site-packages/urllib3/request.py", line 170, in request_encode_body
        return self.urlopen(method, url, **extra_kw)
      File "/home/airflow/.local/lib/python3.7/site-packages/urllib3/poolmanager.py", line 376, in urlopen
        response = conn.urlopen(method, u.request_uri, **kw)
      File "/home/airflow/.local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 788, in urlopen
        method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
      File "/home/airflow/.local/lib/python3.7/site-packages/urllib3/util/retry.py", line 550, in increment
        raise six.reraise(type(error), error, _stacktrace)
      File "/home/airflow/.local/lib/python3.7/site-packages/urllib3/packages/six.py", line 769, in reraise
        raise value.with_traceback(tb)
      File "/home/airflow/.local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 710, in urlopen
        chunked=chunked,
      File "/home/airflow/.local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 449, in _make_request
        six.raise_from(e, None)
      File "<string>", line 3, in raise_from
      File "/home/airflow/.local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 444, in _make_request
        httplib_response = conn.getresponse()
      File "/usr/local/lib/python3.7/http/client.py", line 1373, in getresponse
        response.begin()
      File "/usr/local/lib/python3.7/http/client.py", line 319, in begin
        version, status, reason = self._read_status()
      File "/usr/local/lib/python3.7/http/client.py", line 288, in _read_status
        raise RemoteDisconnected("Remote end closed connection without"
    urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
   [2023-01-22T05:00:33.740+0000] {scheduler_job.py:776} ERROR - Exception when executing SchedulerJob._run_scheduler_loop
    Traceback (most recent call last):
      File "/home/airflow/.local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 710, in urlopen
        chunked=chunked,
      File "/home/airflow/.local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 449, in _make_request
        six.raise_from(e, None)
      File "<string>", line 3, in raise_from
      File "/home/airflow/.local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 444, in _make_request
        httplib_response = conn.getresponse()
      File "/usr/local/lib/python3.7/http/client.py", line 1373, in getresponse
        response.begin()
      File "/usr/local/lib/python3.7/http/client.py", line 319, in begin
        version, status, reason = self._read_status()
      File "/usr/local/lib/python3.7/http/client.py", line 288, in _read_status
        raise RemoteDisconnected("Remote end closed connection without"
    http.client.RemoteDisconnected: Remote end closed connection without response
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "/home/airflow/.local/lib/python3.7/site-packages/airflow/jobs/scheduler_job.py", line 759, in _execute
        self._run_scheduler_loop()
      File "/home/airflow/.local/lib/python3.7/site-packages/airflow/jobs/scheduler_job.py", line 887, in _run_scheduler_loop
        self.executor.heartbeat()
      File "/home/airflow/.local/lib/python3.7/site-packages/airflow/executors/base_executor.py", line 175, in heartbeat
        self.sync()
      File "/home/airflow/.local/lib/python3.7/site-packages/airflow/executors/kubernetes_executor.py", line 632, in sync
        self.kube_scheduler.run_next(task)
      File "/home/airflow/.local/lib/python3.7/site-packages/airflow/executors/kubernetes_executor.py", line 344, in run_next
        self.run_pod_async(pod, **self.kube_config.kube_client_request_args)
      File "/home/airflow/.local/lib/python3.7/site-packages/airflow/executors/kubernetes_executor.py", line 275, in run_pod_async
        raise e
      File "/home/airflow/.local/lib/python3.7/site-packages/airflow/executors/kubernetes_executor.py", line 270, in run_pod_async
        body=sanitized_pod, namespace=pod.metadata.namespace, **kwargs
      File "/home/airflow/.local/lib/python3.7/site-packages/kubernetes/client/api/core_v1_api.py", line 7356, in create_namespaced_pod
        return self.create_namespaced_pod_with_http_info(namespace, body, **kwargs)  # noqa: E501
      File "/home/airflow/.local/lib/python3.7/site-packages/kubernetes/client/api/core_v1_api.py", line 7469, in create_namespaced_pod_with_http_info
        collection_formats=collection_formats)
      File "/home/airflow/.local/lib/python3.7/site-packages/kubernetes/client/api_client.py", line 353, in call_api
        _preload_content, _request_timeout, _host)
      File "/home/airflow/.local/lib/python3.7/site-packages/kubernetes/client/api_client.py", line 184, in __call_api
        _request_timeout=_request_timeout)
      File "/home/airflow/.local/lib/python3.7/site-packages/kubernetes/client/api_client.py", line 397, in request
        body=body)
      File "/home/airflow/.local/lib/python3.7/site-packages/kubernetes/client/rest.py", line 281, in POST
        body=body)
      File "/home/airflow/.local/lib/python3.7/site-packages/kubernetes/client/rest.py", line 173, in request
        headers=headers)
      File "/home/airflow/.local/lib/python3.7/site-packages/urllib3/request.py", line 79, in request
        method, url, fields=fields, headers=headers, **urlopen_kw
      File "/home/airflow/.local/lib/python3.7/site-packages/urllib3/request.py", line 170, in request_encode_body
        return self.urlopen(method, url, **extra_kw)
      File "/home/airflow/.local/lib/python3.7/site-packages/urllib3/poolmanager.py", line 376, in urlopen
        response = conn.urlopen(method, u.request_uri, **kw)
      File "/home/airflow/.local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 788, in urlopen
        method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
      File "/home/airflow/.local/lib/python3.7/site-packages/urllib3/util/retry.py", line 550, in increment
        raise six.reraise(type(error), error, _stacktrace)
      File "/home/airflow/.local/lib/python3.7/site-packages/urllib3/packages/six.py", line 769, in reraise
        raise value.with_traceback(tb)
      File "/home/airflow/.local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 710, in urlopen
        chunked=chunked,
      File "/home/airflow/.local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 449, in _make_request
        six.raise_from(e, None)
      File "<string>", line 3, in raise_from
      File "/home/airflow/.local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 444, in _make_request
        httplib_response = conn.getresponse()
      File "/usr/local/lib/python3.7/http/client.py", line 1373, in getresponse
        response.begin()
      File "/usr/local/lib/python3.7/http/client.py", line 319, in begin
        version, status, reason = self._read_status()
      File "/usr/local/lib/python3.7/http/client.py", line 288, in _read_status
        raise RemoteDisconnected("Remote end closed connection without"
    urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
   [2023-01-22T05:00:33.741+0000] {kubernetes_executor.py:842} INFO - Shutting down Kubernetes executor
   [2023-01-22T05:00:33.741+0000] {kubernetes_executor.py:843} DEBUG - Flushing task_queue...
   [2023-01-22T05:00:33.741+0000] {kubernetes_executor.py:797} DEBUG - Executor shutting down, task_queue approximate size=0
   [2023-01-22T05:00:33.741+0000] {kubernetes_executor.py:845} DEBUG - Flushing result_queue...
   [2023-01-22T05:00:33.741+0000] {kubernetes_executor.py:810} DEBUG - Executor shutting down, result_queue approximate size=0
   [2023-01-22T05:00:33.742+0000] {kubernetes_executor.py:408} DEBUG - Terminating kube_watcher...
   [2023-01-22T05:00:33.747+0000] {kubernetes_executor.py:411} DEBUG - kube_watcher=<KubernetesJobWatcher(KubernetesJobWatcher-4, stopped)>
   [2023-01-22T05:00:33.748+0000] {kubernetes_executor.py:412} DEBUG - Flushing watcher_queue...
   [2023-01-22T05:00:33.748+0000] {kubernetes_executor.py:396} DEBUG - Executor shutting down, watcher_queue approx. size=0
   [2023-01-22T05:00:33.748+0000] {kubernetes_executor.py:416} DEBUG - Shutting down manager...
   [2023-01-22T05:00:34.761+0000] {process_utils.py:133} INFO - Sending Signals.SIGTERM to group 44. PIDs of all processes in the group: [44]
   [2023-01-22T05:00:34.761+0000] {process_utils.py:84} INFO - Sending the signal Signals.SIGTERM to group 44
   [2023-01-22T05:00:35.214+0000] {process_utils.py:79} INFO - Process psutil.Process(pid=44, status='terminated', exitcode=0, started='04:27:08') (44) terminated with exit code 0
   [2023-01-22T05:00:35.215+0000] {scheduler_job.py:788} INFO - Exited execute loop
   [2023-01-22T05:00:35.373+0000] {cli_action_loggers.py:83} DEBUG - Calling callbacks: []
   
   ```
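
   The failure itself is the Kubernetes API server (or the load balancer in front of it) closing a connection mid-request: `create_namespaced_pod` dies with `RemoteDisconnected`, which aborts the scheduler loop. In case it is an idle-connection timeout on the AKS side, here is a minimal config sketch to try, assuming the `[kubernetes_executor]` section name that the deprecation warnings in the log refer to (on older configs the section is `[kubernetes]`); the numeric values are illustrative, not recommendations:
   ```
   [kubernetes_executor]
   # Send TCP keepalives so idle connections to the API server are not
   # silently dropped by an intermediate load balancer (values in seconds).
   enable_tcp_keepalive = True
   tcp_keep_idle = 120
   tcp_keep_intvl = 30
   tcp_keep_cnt = 6

   # Pass an explicit [connect, read] timeout through to the kubernetes
   # client, so a hung request fails with a clear timeout instead of
   # blocking until the remote end disconnects.
   kube_client_request_args = { "_request_timeout": [60, 60] }
   ```
   The traceback shows `run_next` already forwarding `kube_client_request_args` into `run_pod_async`, so `_request_timeout` would reach the failing `create_namespaced_pod` call.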
   

