Smirn08 opened a new issue, #26008:
URL: https://github.com/apache/airflow/issues/26008

   ### Apache Airflow version
   
   2.3.4
   
   ### What happened
   
   I have tasks running in K8s with the KubernetesExecutor, and I need to attach a 
unique pod label to them. After upgrading to 2.3.4, I started getting the error 
`Cannot convert a non-kubernetes.client.models.V1Pod object into a 
KubernetesExecutorConfig` when a task executes, even though the pods themselves 
are created successfully.
   
   
   ### What you think should happen instead
   
   
   I think this is the effect of 
[these](https://github.com/apache/airflow/pull/24356) changes, but I'm not sure.
   
   ```
   *** Reading local file: 
/airflow/logs/dag_id=my_dag/run_id=manual__2022-08-27T11:23:42.943920+00:00/task_id=example/attempt=1.log
   [2022-08-27, 14:24:20 MSK] {taskinstance.py:1171} INFO - Dependencies all 
met for <TaskInstance: my_dag.example manual__2022-08-27T11:23:42.943920+00:00 
[queued]>
   [2022-08-27, 14:24:20 MSK] {taskinstance.py:1171} INFO - Dependencies all 
met for <TaskInstance: my_dag.example manual__2022-08-27T11:23:42.943920+00:00 
[queued]>
   [2022-08-27, 14:24:20 MSK] {taskinstance.py:1368} INFO - 
   
--------------------------------------------------------------------------------
   [2022-08-27, 14:24:20 MSK] {taskinstance.py:1369} INFO - Starting attempt 1 
of 1
   [2022-08-27, 14:24:20 MSK] {taskinstance.py:1370} INFO - 
   
--------------------------------------------------------------------------------
   [2022-08-27, 14:24:20 MSK] {taskinstance.py:1389} INFO - Executing 
<Task(PythonOperator): example> on 2022-08-27 11:23:42.943920+00:00
   [2022-08-27, 14:24:20 MSK] {standard_task_runner.py:52} INFO - Started 
process 55 to run task
   [2022-08-27, 14:24:20 MSK] {standard_task_runner.py:79} INFO - Running: 
['airflow', 'tasks', 'run', 'my_dag', 'example', 
'manual__2022-08-27T11:23:42.943920+00:00', '--job-id', '11062', '--raw', 
'--subdir', 'DAGS_FOLDER/my_dag.py', '--cfg-path', '/tmp/tmpo2fb44af', 
'--error-file', '/tmp/tmp9y8fjhzy']
   [2022-08-27, 14:24:20 MSK] {standard_task_runner.py:80} INFO - Job 11062: 
Subtask example
   [2022-08-27, 14:24:21 MSK] {task_command.py:371} INFO - Running 
<TaskInstance: my_dag.example manual__2022-08-27T11:23:42.943920+00:00 
[running]> on host mydagexample -bf562e4f714543b0b1d8ee52c2e255ff
   [2022-08-27, 14:24:21 MSK] {taskinstance.py:1902} ERROR - Task failed with 
exception
   Traceback (most recent call last):
     File 
"/opt/conda/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 
1463, in _run_raw_task
       self._execute_task_with_callbacks(context, test_mode)
     File 
"/opt/conda/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 
1569, in _execute_task_with_callbacks
       rtif = RenderedTaskInstanceFields(ti=self, render_templates=False)
     File "<string>", line 4, in __init__
     File "/opt/conda/lib/python3.7/site-packages/sqlalchemy/orm/state.py", 
line 437, in _initialize_instance
       manager.dispatch.init_failure(self, args, kwargs)
     File 
"/opt/conda/lib/python3.7/site-packages/sqlalchemy/util/langhelpers.py", line 
72, in __exit__
       with_traceback=exc_tb,
     File "/opt/conda/lib/python3.7/site-packages/sqlalchemy/util/compat.py", 
line 211, in raise_
       raise exception
     File "/opt/conda/lib/python3.7/site-packages/sqlalchemy/orm/state.py", 
line 434, in _initialize_instance
       return manager.original_init(*mixed[1:], **kwargs)
     File 
"/opt/conda/lib/python3.7/site-packages/airflow/models/renderedtifields.py", 
line 90, in __init__
       self.k8s_pod_yaml = ti.render_k8s_pod_yaml()
     File 
"/opt/conda/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 
2250, in render_k8s_pod_yaml
       pod_override_object=PodGenerator.from_obj(self.executor_config),
     File 
"/opt/conda/lib/python3.7/site-packages/airflow/kubernetes/pod_generator.py", 
line 180, in from_obj
       'Cannot convert a non-kubernetes.client.models.V1Pod object into a 
KubernetesExecutorConfig'
   TypeError: Cannot convert a non-kubernetes.client.models.V1Pod object into a 
KubernetesExecutorConfig
   [2022-08-27, 14:24:22 MSK] {taskinstance.py:1412} INFO - Marking task as 
FAILED. dag_id=my_dag, task_id=example, execution_date=20220827T112342, 
start_date=20220827T112420, end_date=20220827T112422
   [2022-08-27, 14:24:22 MSK] {standard_task_runner.py:97} ERROR - Failed to 
execute job 11062 for task example (Cannot convert a 
non-kubernetes.client.models.V1Pod object into a KubernetesExecutorConfig; 55)
   [2022-08-27, 14:24:22 MSK] {local_task_job.py:156} INFO - Task exited with 
return code 1
   [2022-08-27, 14:24:22 MSK] {local_task_job.py:279} INFO - 0 downstream tasks 
scheduled from follow-on schedule check
   ```
   
   ### How to reproduce
   
   I came to the conclusion that this error is caused by the random name generated 
for the `svc_name` variable: if I remove `svc_name`, or give it a constant value, 
everything works fine.
   The problem is not limited to the `metadata` parameter.
   
   Here is a simple example.
   
   **template.py**
   ```python
   import uuid
   from typing import Dict

   from kubernetes.client import models as k8s


   def get_executor_config() -> Dict[str, k8s.V1Pod]:
       svc_name = str(uuid.uuid4()).replace("-", "")  # error
       # svc_name = "static_name"  # it's fine
       executor_config = {
           "pod_override": k8s.V1Pod(
               metadata=k8s.V1ObjectMeta(labels={"app": svc_name}),
               spec=k8s.V1PodSpec(
                   containers=[
                       k8s.V1Container(name="base"),
                   ],
               ),
           ),
       }
       return executor_config
   ```
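The non-determinism is easy to see in isolation: every evaluation of that line (i.e. every parse of the DAG file) produces a different label value, which is my reading of why a random `svc_name` misbehaves while a constant one does not:

```python
import uuid

# Two separate parses of the DAG file each run this line independently,
# so each parse ends up with a different label value:
first_parse = str(uuid.uuid4()).replace("-", "")
second_parse = str(uuid.uuid4()).replace("-", "")

assert first_parse != second_parse
```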
   
   **my_dag.py**
   ```python
   from airflow import DAG
   from airflow.operators.python import PythonOperator
   from airflow.utils.dates import days_ago
   from template import get_executor_config


   def my_func():
       print("example")


   default_args = {
       "owner": "Airflow",
       "start_date": days_ago(1),
   }

   with DAG(dag_id="my_dag", default_args=default_args, schedule_interval=None) as dag:
       task = PythonOperator(
           task_id="example",
           python_callable=my_func,
           executor_config=get_executor_config(),
       )
   ```
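As a possible workaround (a sketch only, not a confirmed fix — it follows from the observation above that a constant name works), the label could be derived deterministically from the DAG and task ids with `uuid.uuid5`, so that repeated parses of the DAG file all compute the same value while the label stays unique per task:

```python
import uuid

def get_svc_name(dag_id: str, task_id: str) -> str:
    # uuid5 is deterministic for the same input, so every parse of the
    # DAG file (scheduler, worker, webserver) computes the same label.
    return str(uuid.uuid5(uuid.NAMESPACE_DNS, f"{dag_id}.{task_id}")).replace("-", "")

# Stable across calls, unlike uuid.uuid4():
assert get_svc_name("my_dag", "example") == get_svc_name("my_dag", "example")
```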
   
   ### Operating System
   
   Debian GNU/Linux 9 (stretch)
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow-providers-apache-hdfs==3.1.0
   apache-airflow-providers-apache-hive==4.0.0
   apache-airflow-providers-apache-spark==3.0.0
   apache-airflow-providers-cncf-kubernetes==4.3.0
   apache-airflow-providers-common-sql==1.1.0
   apache-airflow-providers-ftp==3.1.0
   apache-airflow-providers-http==4.0.0
   apache-airflow-providers-imap==3.0.0
   apache-airflow-providers-jdbc==3.2.0
   apache-airflow-providers-postgres==5.2.0
   apache-airflow-providers-sftp==4.0.0
   apache-airflow-providers-sqlite==3.2.0
   apache-airflow-providers-ssh==3.1.0
   
   ### Deployment
   
   Other Docker-based deployment
   
   ### Deployment details
   
   **kubectl version**
   ```
   Client Version: version.Info{Major:"1", Minor:"16", GitVersion:"v1.16.8", 
GitCommit:"ec6eb119b81be488b030e849b9e64fda4caaf33c", GitTreeState:"clean", 
BuildDate:"2020-03-12T21:00:06Z", GoVersion:"go1.13.8", Compiler:"gc", 
Platform:"linux/amd64"}
   Server Version: version.Info{Major:"1", Minor:"17", GitVersion:"v1.17.8", 
GitCommit:"35dc4cdc26cfcb6614059c4c6e836e5f0dc61dee", GitTreeState:"clean", 
BuildDate:"2020-06-26T03:36:03Z", GoVersion:"go1.13.9", Compiler:"gc", 
Platform:"linux/amd64"}
   ```
   **Python 3.7.7**, installed with the Airflow constraints
   **DB** - PostgreSQL v12
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

