jedcunningham commented on code in PR #30727:
URL: https://github.com/apache/airflow/pull/30727#discussion_r1181623885


##########
airflow/executors/kubernetes_executor.py:
##########
@@ -702,6 +291,8 @@ def sync(self) -> None:
         for _ in range(self.kube_config.worker_pods_creation_batch_size):
             try:
                 task = self.task_queue.get_nowait()
+                from kubernetes.client.rest import ApiException

Review Comment:
   Do we really want to do this in the loop? Probably just splitting hairs here.
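
   For illustration only, a minimal stand-in for the sync() batch loop (not the PR's actual method body; `launch` is a made-up callable) showing the import done once before the loop:

       from queue import Empty, Queue
       from typing import Callable


       def sync_once(task_queue: Queue, batch_size: int, launch: Callable) -> None:
           """Simplified sketch of the batch loop; names here are illustrative."""
           # Import once, before the loop, rather than on every iteration.
           from kubernetes.client.rest import ApiException

           for _ in range(batch_size):
               try:
                   task = task_queue.get_nowait()
               except Empty:
                   break
               try:
                   launch(task)
               except ApiException as e:
                   print("Kubernetes API error while launching pod:", e)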



##########
airflow/executors/kubernetes_executor.py:
##########
@@ -621,6 +223,8 @@ def execute_async(
             self.log.info("Add task %s with command %s", key, command)
 
         try:
+            from airflow.kubernetes.pod_generator import PodGenerator

Review Comment:
   Can we do this outside the try? If this failed (unlikely) it'd hide the real failure.
   
   nit: It makes the most sense to me to put these right at the start of the method.
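
   Rough sketch of what I mean (`log` is just a stand-in for `self.log`, and the body is simplified, not the executor's real method):

       def build_executor_config(executor_config, log):
           """Illustrative only; `log` stands in for self.log."""
           # Local import at the start of the method, outside the try, so an
           # ImportError surfaces as itself instead of being reported as an
           # invalid executor_config.
           from airflow.kubernetes.pod_generator import PodGenerator

           try:
               return PodGenerator.from_obj(executor_config)
           except Exception:
               log.error("Invalid executor_config: %s", executor_config)
               return None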



##########
airflow/executors/kubernetes_executor.py:
##########
@@ -914,8 +510,10 @@ def adopt_launched_task(
             self.log.error("attempting to adopt taskinstance which was not 
specified by database: %s", ti_key)
             return
 
-        new_worker_id_label = pod_generator.make_safe_label_value(self.scheduler_job_id)
+        new_worker_id_label = self._make_safe_label_value(self.scheduler_job_id)
         try:
+            from kubernetes.client.rest import ApiException

Review Comment:
   nit: import this outside of the try



##########
airflow/executors/kubernetes_executor_utils.py:
##########
@@ -0,0 +1,466 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+import json
+import multiprocessing
+import time
+from queue import Empty, Queue
+from typing import TYPE_CHECKING, Any
+
+from kubernetes import client, watch
+from kubernetes.client import Configuration, models as k8s
+from kubernetes.client.rest import ApiException
+from urllib3.exceptions import ReadTimeoutError
+
+from airflow.exceptions import AirflowException
+from airflow.kubernetes.kube_client import get_kube_client
+from airflow.kubernetes.kubernetes_helper_functions import annotations_to_key, create_pod_id
+from airflow.kubernetes.pod_generator import PodGenerator
+from airflow.utils.log.logging_mixin import LoggingMixin
+from airflow.utils.state import State
+
+if TYPE_CHECKING:
+    from airflow.executors.kubernetes_executor_types import (
+        KubernetesJobType,
+        KubernetesResultsType,
+        KubernetesWatchType,
+    )
+
+
+from airflow.executors.kubernetes_executor_types import ALL_NAMESPACES, POD_EXECUTOR_DONE_KEY
+
+
+class ResourceVersion:
+    """Singleton for tracking resourceVersion from Kubernetes."""
+
+    _instance: ResourceVersion | None = None
+    resource_version: dict[str, str] = {}
+
+    def __new__(cls):
+        if cls._instance is None:
+            cls._instance = super().__new__(cls)
+        return cls._instance
+
+
+class KubernetesJobWatcher(multiprocessing.Process, LoggingMixin):

Review Comment:
   What do folks think about moving the core executor into `kubernetes_executor/__init__.py`, and moving the stuff we are pulling out into files under that new folder? I'm not sure I love the `kubernetes_executor_x.py` pattern we are establishing here.
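
   e.g. something along these lines (names purely illustrative, not a proposal for the final split):

       airflow/executors/kubernetes_executor/
           __init__.py   # KubernetesExecutor itself
           utils.py      # KubernetesJobWatcher, ResourceVersion, ...
           types.py      # ALL_NAMESPACES, POD_EXECUTOR_DONE_KEY, type aliases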



##########
airflow/executors/kubernetes_executor.py:
##########
@@ -773,6 +364,8 @@ def _change_state(
 
         # If we don't have a TI state, look it up from the db. event_buffer expects the TI state
         if state is None:
+            from airflow.models.taskinstance import TaskInstance

Review Comment:
   Does this import being local actually help? I find it hard to believe TI isn't already imported elsewhere when the scheduler is actually running.
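
   For what it's worth, a tiny illustration of why the local import likely buys nothing once TI is loaded anywhere in the process (plain Python semantics, not specific to this PR):

       import sys

       # The first import anywhere in the process pays the full cost...
       from airflow.models.taskinstance import TaskInstance  # noqa: F401

       assert "airflow.models.taskinstance" in sys.modules


       def lookup_ti_class():
           # ...after which a "local" import is just a sys.modules lookup:
           # cheap, but it saves no startup time or memory either.
           from airflow.models.taskinstance import TaskInstance
           return TaskInstance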


