javi-cortes opened a new issue, #57743:
URL: https://github.com/apache/airflow/issues/57743

   ### Apache Airflow version
   
   3.1.1
   
   ### If "Other Airflow 2/3 version" selected, which one?
   
   _No response_
   
   ### What happened?
   
   Hi! 
   
   I’m using Airflow 3.1.0 / 3.1.1 with the KubernetesExecutor, and I’m 
overriding the pod via executor_config in default_args.
   
   This works perfectly when applied at task level, but when I set it at DAG 
level (via default_args), the UI (FastAPI) returns a 500 Internal Server Error 
because the response serializer can’t handle the V1Pod object.
   
   Error from api-server logs:
   ```
   INFO:     10.101.11.64:44642 - "GET /api/v2/dags/pip_freeze_dag_override/details HTTP/1.1" 500 Internal Server Error
   ERROR:    Exception in ASGI application
   Traceback (most recent call last):
     [...]
     File "/home/airflow/.local/lib/python3.12/site-packages/fastapi/routing.py", line 188, in serialize_response
       return field.serialize(
     File "/home/airflow/.local/lib/python3.12/site-packages/fastapi/_compat.py", line 152, in serialize
       return self._type_adapter.dump_python(
     File "/home/airflow/.local/lib/python3.12/site-packages/pydantic/type_adapter.py", line 572, in dump_python
       return self.serializer.to_python(
   pydantic_core._pydantic_core.PydanticSerializationError: Unable to serialize unknown type: <class 'kubernetes.client.models.v1_pod.V1Pod'>
   ```
   
   
   Task runs fine when pod_override is passed directly to the operator.
   
   UI / API call fails (/api/v2/dags/{dag_id}/details) because V1Pod is not 
JSON-serializable by Pydantic.
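
The failure mode itself can be reproduced without Airflow or Pydantic: a plain-Python object with no JSON representation fails any standard encoder the same way. A minimal sketch using the stdlib `json` encoder, with a hypothetical `FakeV1Pod` standing in for the real kubernetes client model:

```python
import json

# Hypothetical stand-in for kubernetes.client.models.v1_pod.V1Pod;
# any plain Python object without JSON support fails the same way.
class FakeV1Pod:
    def __init__(self):
        self.spec = {"containers": [{"name": "base"}]}

try:
    json.dumps({"pod_override": FakeV1Pod()})
except TypeError as exc:
    # Mirrors the PydanticSerializationError above: the encoder has no
    # rule for an arbitrary object instance.
    print(f"not JSON-serializable: {exc}")
```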
   
   Scheduler logs show a proper V1Pod object being used if the task is 
triggered manually.
   
   The serialization failure happens when the UI (or API) calls the 
/api/v2/dags/{dag_id}/details endpoint, which is implemented as:
   
   ```python
   @dags_router.get("/{dag_id}/details")
   def get_dag_details(...):
       dag = get_latest_version_of_dag(dag_bag, dag_id, session)
       dag_model = session.get(DagModel, dag_id)
       for key, value in dag.__dict__.items():
           if not key.startswith("_") and not hasattr(dag_model, key):
               setattr(dag_model, key, value)
       return dag_model
   ```
   
   The returned dag_model is then serialized into a DAGDetailsResponse Pydantic 
schema.
   When the DAG contains executor_config={"pod_override": <V1Pod>}, the 
serializer hits a PydanticSerializationError because V1Pod isn’t 
JSON-serializable.
   If you sanitize it into a dict, the scheduler later fails, since 
PodGenerator.from_obj() rejects non-V1Pod objects.
   PodGenerator only deserializes YAML templates (via 
deserialize_model_file()), so there’s currently no code path that handles this 
correctly for dicts returned by the API.
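
For illustration, the kubernetes client models all expose `to_dict()`, so in principle the API layer could flatten them into JSON-safe structures before Pydantic sees them. A rough sketch; `sanitize` is an illustrative helper (not existing Airflow code) and `FakePod` is a stub in place of the real `V1Pod`:

```python
from typing import Any

def sanitize(obj: Any) -> Any:
    """Recursively flatten objects exposing to_dict() -- as the kubernetes
    client models do -- into plain JSON-safe structures."""
    if hasattr(obj, "to_dict"):
        return sanitize(obj.to_dict())
    if isinstance(obj, dict):
        return {k: sanitize(v) for k, v in obj.items()}
    if isinstance(obj, (list, tuple)):
        return [sanitize(v) for v in obj]
    return obj

# Hypothetical stub standing in for a real V1Pod, for illustration only.
class FakePod:
    def to_dict(self):
        return {"spec": {"containers": [{"name": "base", "image": "busybox"}]}}

print(sanitize({"pod_override": FakePod()}))
# {'pod_override': {'spec': {'containers': [{'name': 'base', 'image': 'busybox'}]}}}
```

The catch is the round trip: once the pod is a plain dict, the scheduler side needs a matching rehydration path back to V1Pod, which is exactly the missing code path noted above.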
   
   ### What you think should happen instead?
   
   The API response should serialize cleanly even when 
executor_config["pod_override"] contains a V1Pod.
   Alternatively, Airflow could auto-sanitize or exclude V1Pod fields when 
building the DAG details response.
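
For the "exclude" alternative, one possible shape is to swap non-serializable executor_config values for a placeholder in the API response only, leaving the stored DAG untouched. A sketch; `drop_unserializable` is a hypothetical helper, not an existing Airflow function:

```python
import json

def drop_unserializable(config: dict) -> dict:
    """Return a copy of executor_config where values the JSON encoder
    cannot handle are replaced by a placeholder string."""
    cleaned = {}
    for key, value in config.items():
        try:
            json.dumps(value)  # probe whether the value serializes
            cleaned[key] = value
        except TypeError:
            cleaned[key] = f"<non-serializable: {type(value).__name__}>"
    return cleaned

class FakePod:  # hypothetical stand-in for kubernetes V1Pod
    pass

print(drop_unserializable({"pod_override": FakePod(), "queue": "kubernetes"}))
# {'pod_override': '<non-serializable: FakePod>', 'queue': 'kubernetes'}
```

Sanitizing only at the response layer would keep the scheduler's V1Pod intact, sidestepping the PodGenerator.from_obj() round-trip problem described above.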
   
   ### How to reproduce
   
   Minimal Reproducible Example:
   
   ```python
   from datetime import datetime

   from airflow import DAG
   # Airflow 3 import path (airflow.operators.bash was removed in 3.0)
   from airflow.providers.standard.operators.bash import BashOperator
   from kubernetes import client as k8s

   default_args = {
       "executor_config": {
           "pod_override": k8s.V1Pod(
               spec=k8s.V1PodSpec(
                   containers=[
                       k8s.V1Container(
                           name="base",
                           image="busybox",
                           image_pull_policy="IfNotPresent",
                       )
                   ],
                   restart_policy="Never",
               ),
           )
       }
   }

   with DAG(
       dag_id="example_pod_override_dag",
       start_date=datetime(2024, 1, 1),
       schedule=None,
       catchup=False,
       default_args=default_args,  # ← DAG-level
   ) as dag:
       BashOperator(
           task_id="pip_freeze",
           bash_command="python -m pip --version && python -m pip freeze",
           # inherits executor_config from default_args
       )
   ```
   
   ### Operating System
   
   Linux, Ubuntu 24.04
   
   ### Versions of Apache Airflow Providers
   
   3.1.0
   
   ### Deployment
   
   Official Apache Airflow Helm Chart
   
   ### Deployment details
   
   k8s official helm chart
   
   ### Anything else?
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [x] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

