arjunanan6 commented on PR #29012:
URL: https://github.com/apache/airflow/pull/29012#issuecomment-1398288734

   Alright, so I tried this out locally, where I have no restrictions. As discussed in #28394, scheduled tasks run just fine, but running a task manually raises an exception:
   
   ```
   [2023-01-20T11:57:44.263+0000] {kubernetes_executor.py:527} INFO - Start Kubernetes executor
   [2023-01-20T11:57:44.306+0000] {kubernetes_executor.py:476} INFO - Found 0 queued task instances
   [2023-01-20T11:57:44.309+0000] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'HELLO_WORLD', 'hello', 'scheduled__2023-01-20T11:40:00+00:00', '--ignore-all-dependencies', '--ignore-dependencies', '--force', '--local', '--pool', 'default_pool', '--subdir', 'DAGS_FOLDER/hello.py']
   [2023-01-20T11:57:44.310+0000] {kubernetes_executor.py:559} INFO - Add task TaskInstanceKey(dag_id='HELLO_WORLD', task_id='hello', run_id='scheduled__2023-01-20T11:40:00+00:00', try_number=4, map_index=-1) with command ['airflow', 'tasks', 'run', 'HELLO_WORLD', 'hello', 'scheduled__2023-01-20T11:40:00+00:00', '--ignore-all-dependencies', '--ignore-dependencies', '--force', '--local', '--pool', 'default_pool', '--subdir', 'DAGS_FOLDER/hello.py']
   [2023-01-20T11:57:44.310+0000] {kubernetes_executor.py:130} INFO - Event: and now my watch begins starting at resource_version: 0
   [2023-01-20T11:57:44.383+0000] {kubernetes_executor.py:339} INFO - Creating kubernetes pod for job is TaskInstanceKey(dag_id='HELLO_WORLD', task_id='hello', run_id='scheduled__2023-01-20T11:40:00+00:00', try_number=2, map_index=-1), with pod name hello-world-hello-be2bad2bd8dc4568bd1ba73082ecef4a
   [2023-01-20T11:57:44.392+0000] {kubernetes_executor.py:274} ERROR - Exception when attempting to create Namespaced Pod: {
     "apiVersion": "v1",
     "kind": "Pod",
     "metadata": {
       "annotations": {
         "dag_id": "HELLO_WORLD",
         "task_id": "hello",
         "try_number": "2",
         "run_id": "scheduled__2023-01-20T11:40:00+00:00"
       },
       "labels": {
         "tier": "airflow",
         "component": "worker",
         "release": "airflowlocal",
         "airflow-worker": "None",
         "dag_id": "HELLO_WORLD",
         "task_id": "hello",
         "try_number": "2",
         "airflow_version": "2.5.0",
         "kubernetes_executor": "True",
         "run_id": "scheduled__2023-01-20T1140000000-c15690dab"
       },
       "name": "hello-world-hello-be2bad2bd8dc4568bd1ba73082ecef4a",
       "namespace": "airflow"
     },
     "spec": {
       "affinity": {},
       "containers": [
         {
           "args": [
             "airflow",
             "tasks",
             "run",
             "HELLO_WORLD",
             "hello",
             "scheduled__2023-01-20T11:40:00+00:00",
             "--ignore-all-dependencies",
             "--ignore-dependencies",
             "--force",
             "--local",
             "--pool",
             "default_pool",
             "--subdir",
             "DAGS_FOLDER/hello.py"
           ],
           "env": [
             {
               "name": "AIRFLOW__CORE__EXECUTOR",
               "value": "LocalExecutor"
             },
             {
               "name": "AIRFLOW__CORE__FERNET_KEY",
               "valueFrom": {
                 "secretKeyRef": {
                   "key": "fernet-key",
                   "name": "airflowlocal-fernet-key"
                 }
               }
             },
             {
               "name": "AIRFLOW__CORE__SQL_ALCHEMY_CONN",
               "valueFrom": {
                 "secretKeyRef": {
                   "key": "connection",
                   "name": "airflowlocal-airflow-metadata"
                 }
               }
             },
             {
               "name": "AIRFLOW__DATABASE__SQL_ALCHEMY_CONN",
               "valueFrom": {
                 "secretKeyRef": {
                   "key": "connection",
                   "name": "airflowlocal-airflow-metadata"
                 }
               }
             },
             {
               "name": "AIRFLOW_CONN_AIRFLOW_DB",
               "valueFrom": {
                 "secretKeyRef": {
                   "key": "connection",
                   "name": "airflowlocal-airflow-metadata"
                 }
               }
             },
             {
               "name": "AIRFLOW__WEBSERVER__SECRET_KEY",
               "valueFrom": {
                 "secretKeyRef": {
                   "key": "webserver-secret-key",
                   "name": "airflowlocal-webserver-secret-key"
                 }
               }
             },
             {
               "name": "AIRFLOW_IS_K8S_EXECUTOR_POD",
               "value": "True"
             }
           ],
           "image": "my-dags:0.0.1",
           "imagePullPolicy": "IfNotPresent",
           "name": "base",
           "resources": {},
           "volumeMounts": [
             {
               "mountPath": "/opt/airflow/logs",
               "name": "logs"
             },
             {
               "mountPath": "/opt/airflow/airflow.cfg",
               "name": "config",
               "readOnly": true,
               "subPath": "airflow.cfg"
             },
             {
               "mountPath": "/opt/airflow/config/airflow_local_settings.py",
               "name": "config",
               "readOnly": true,
               "subPath": "airflow_local_settings.py"
             }
           ]
         }
       ],
       "restartPolicy": "Never",
       "securityContext": {
         "fsGroup": 0,
         "runAsUser": 50000
       },
       "serviceAccountName": "airflowlocal-worker",
       "volumes": [
         {
           "emptyDir": {},
           "name": "logs"
         },
         {
           "configMap": {
             "name": "airflowlocal-airflow-config"
           },
           "name": "config"
         }
       ]
     }
   }
   Traceback (most recent call last):
     File "/home/airflow/.local/lib/python3.9/site-packages/airflow/executors/kubernetes_executor.py", line 269, in run_pod_async
       resp = self.kube_client.create_namespaced_pod(
     File "/home/airflow/.local/lib/python3.9/site-packages/kubernetes/client/api/core_v1_api.py", line 7356, in create_namespaced_pod
       return self.create_namespaced_pod_with_http_info(namespace, body, **kwargs)  # noqa: E501
     File "/home/airflow/.local/lib/python3.9/site-packages/kubernetes/client/api/core_v1_api.py", line 7455, in create_namespaced_pod_with_http_info
       return self.api_client.call_api(
     File "/home/airflow/.local/lib/python3.9/site-packages/kubernetes/client/api_client.py", line 348, in call_api
       return self.__call_api(resource_path, method,
     File "/home/airflow/.local/lib/python3.9/site-packages/kubernetes/client/api_client.py", line 180, in __call_api
       response_data = self.request(
     File "/home/airflow/.local/lib/python3.9/site-packages/kubernetes/client/api_client.py", line 391, in request
       return self.rest_client.POST(url,
     File "/home/airflow/.local/lib/python3.9/site-packages/kubernetes/client/rest.py", line 275, in POST
       return self.request("POST", url,
     File "/home/airflow/.local/lib/python3.9/site-packages/kubernetes/client/rest.py", line 234, in request
       raise ApiException(http_resp=r)
   kubernetes.client.exceptions.ApiException: (403)
   Reason: Forbidden
   HTTP response headers: HTTPHeaderDict({'Audit-Id': 'c2225504-16da-4966-ae42-36241a0d49cb', 'Cache-Control': 'no-cache, private', 'Content-Type': 'application/json', 'X-Content-Type-Options': 'nosniff', 'X-Kubernetes-Pf-Flowschema-Uid': '3dd6f880-05b3-46ee-a0f2-a45a4c663a50', 'X-Kubernetes-Pf-Prioritylevel-Uid': 'b8f0d7a8-7a33-4a3e-8e49-112cfe10ad1d', 'Date': 'Fri, 20 Jan 2023 11:57:44 GMT', 'Content-Length': '299'})
   HTTP response body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"pods is forbidden: User \"system:serviceaccount:airflow:airflowlocal-webserver\" cannot create resource \"pods\" in API group \"\" in the namespace \"airflow\"","reason":"Forbidden","details":{"kind":"pods"},"code":403}
   ```
   
   Which is strange, since this is the same service account that spins up pods for scheduled tasks without issue. Here is the YAML of that pod definition:
   
   ```
   apiVersion: v1
   kind: Pod
   metadata:
     annotations:
       dag_id: HELLO_WORLD
       run_id: scheduled__2023-01-20T11:40:00+00:00
       task_id: hello
       try_number: "3"
     creationTimestamp: "2023-01-20T11:57:11Z"
     labels:
       airflow-worker: "80"
       airflow_version: 2.5.0
       component: worker
       dag_id: HELLO_WORLD
       kubernetes_executor: "True"
       release: airflowlocal
       run_id: scheduled__2023-01-20T1140000000-c15690dab
       task_id: hello
       tier: airflow
       try_number: "3"
     name: hello-world-hello-3c579ea84688467bab7036d3bc940c64
     namespace: airflow
     resourceVersion: "40588"
     uid: 53017417-d794-4532-bdb8-fa92db4f97fd
   spec:
     affinity: {}
     containers:
     - args:
       - airflow
       - tasks
       - run
       - HELLO_WORLD
       - hello
       - scheduled__2023-01-20T11:40:00+00:00
       - --local
       - --subdir
       - DAGS_FOLDER/hello.py
       env:
       - name: AIRFLOW__CORE__EXECUTOR
         value: LocalExecutor
       - name: AIRFLOW__CORE__FERNET_KEY
         valueFrom:
           secretKeyRef:
             key: fernet-key
             name: airflowlocal-fernet-key
       - name: AIRFLOW__CORE__SQL_ALCHEMY_CONN
         valueFrom:
           secretKeyRef:
             key: connection
             name: airflowlocal-airflow-metadata
       - name: AIRFLOW__DATABASE__SQL_ALCHEMY_CONN
         valueFrom:
           secretKeyRef:
             key: connection
             name: airflowlocal-airflow-metadata
       - name: AIRFLOW_CONN_AIRFLOW_DB
         valueFrom:
           secretKeyRef:
             key: connection
             name: airflowlocal-airflow-metadata
       - name: AIRFLOW__WEBSERVER__SECRET_KEY
         valueFrom:
           secretKeyRef:
             key: webserver-secret-key
             name: airflowlocal-webserver-secret-key
       - name: AIRFLOW_IS_K8S_EXECUTOR_POD
         value: "True"
       image: my-dags:0.0.1
       imagePullPolicy: IfNotPresent
       name: base
       resources: {}
       terminationMessagePath: /dev/termination-log
       terminationMessagePolicy: File
       volumeMounts:
       - mountPath: /opt/airflow/logs
         name: logs
       - mountPath: /opt/airflow/airflow.cfg
         name: config
         readOnly: true
         subPath: airflow.cfg
       - mountPath: /opt/airflow/config/airflow_local_settings.py
         name: config
         readOnly: true
         subPath: airflow_local_settings.py
       - mountPath: /var/run/secrets/kubernetes.io/serviceaccount
         name: kube-api-access-w5gcp
         readOnly: true
     dnsPolicy: ClusterFirst
     enableServiceLinks: true
     nodeName: kind-control-plane
     preemptionPolicy: PreemptLowerPriority
     priority: 0
     restartPolicy: Never
     schedulerName: default-scheduler
     securityContext:
       fsGroup: 0
       runAsUser: 50000
     serviceAccount: airflowlocal-worker
     serviceAccountName: airflowlocal-worker
     terminationGracePeriodSeconds: 30
     tolerations:
     - effect: NoExecute
       key: node.kubernetes.io/not-ready
       operator: Exists
       tolerationSeconds: 300
     - effect: NoExecute
       key: node.kubernetes.io/unreachable
       operator: Exists
       tolerationSeconds: 300
     volumes:
     - emptyDir: {}
       name: logs
     - configMap:
         defaultMode: 420
         name: airflowlocal-airflow-config
       name: config
     - name: kube-api-access-w5gcp
       projected:
         defaultMode: 420
         sources:
         - serviceAccountToken:
             expirationSeconds: 3607
             path: token
         - configMap:
             items:
             - key: ca.crt
               path: ca.crt
             name: kube-root-ca.crt
         - downwardAPI:
             items:
             - fieldRef:
                 apiVersion: v1
                 fieldPath: metadata.namespace
               path: namespace
   status:
     conditions:
     - lastProbeTime: null
       lastTransitionTime: "2023-01-20T11:57:11Z"
       reason: PodCompleted
       status: "True"
       type: Initialized
     - lastProbeTime: null
       lastTransitionTime: "2023-01-20T11:57:20Z"
       reason: PodCompleted
       status: "False"
       type: Ready
     - lastProbeTime: null
       lastTransitionTime: "2023-01-20T11:57:20Z"
       reason: PodCompleted
       status: "False"
       type: ContainersReady
     - lastProbeTime: null
       lastTransitionTime: "2023-01-20T11:57:11Z"
       status: "True"
       type: PodScheduled
     containerStatuses:
      - containerID: containerd://a81155f6103c7fee21ab8b06298d8ba04a112f91356685f2d6414dd68136eb3b
        image: docker.io/library/my-dags:0.0.1
        imageID: docker.io/library/import-2023-01-20@sha256:9298bf3504bc8279b2270f7d41e9d2c1244a39e22e0ac0c534c5e881d5621ca9
        lastState: {}
        name: base
        ready: false
        restartCount: 0
        started: false
        state:
          terminated:
            containerID: containerd://a81155f6103c7fee21ab8b06298d8ba04a112f91356685f2d6414dd68136eb3b
            exitCode: 0
            finishedAt: "2023-01-20T11:57:19Z"
            reason: Completed
            startedAt: "2023-01-20T11:57:11Z"
     hostIP: 172.18.0.2
     phase: Running
     podIP: 10.244.0.42
     podIPs:
     - ip: 10.244.0.42
     qosClass: BestEffort
     startTime: "2023-01-20T11:57:11Z"
   
   ```
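   Note that the 403 above is raised for `system:serviceaccount:airflow:airflowlocal-webserver`, while this pod's `serviceAccountName` is `airflowlocal-worker` (the identity the pod runs as, which is not necessarily the identity that created it). If useful, the permissions of both can be listed side by side (a sketch reusing the SA names from the dumps above):
   
   ```
   # enumerate the RBAC permissions of both service accounts in the airflow namespace
   kubectl -n airflow auth can-i --list --as system:serviceaccount:airflow:airflowlocal-webserver
   kubectl -n airflow auth can-i --list --as system:serviceaccount:airflow:airflowlocal-worker
   ```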
   
   I verified whether this SA is able to read pods, and that checks out too:
   
   ```
   kubectl auth can-i get pods --as system:serviceaccount:airflow:airflowlocal-webserver
   yes
   ```
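   One caveat: `get` and `create` are separate verbs in Kubernetes RBAC, and the request above was rejected on `create`, so that verb is worth checking explicitly (same service account as above):
   
   ```
   # test the exact verb the API server rejected
   kubectl -n airflow auth can-i create pods --as system:serviceaccount:airflow:airflowlocal-webserver
   ```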
   
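   If that comes back `no`, a missing grant would explain the behaviour. A minimal sketch of such a grant (the role and binding names here are hypothetical, not from the chart):
   
   ```
   # hypothetical role/binding names; grants pod lifecycle verbs to the webserver SA
   kubectl -n airflow create role pod-launcher --verb=create,get,list,watch,delete --resource=pods
   kubectl -n airflow create rolebinding pod-launcher --role=pod-launcher --serviceaccount=airflow:airflowlocal-webserver
   ```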
   

