valayDave opened a new issue, #24669:
URL: https://github.com/apache/airflow/issues/24669

   ### Apache Airflow version
   
   2.3.2 (latest released)
   
   ### What happened
   
   I am using the `KubernetesPodOperator` and setting the `resources` keyword argument in the `partial` call so that I can use dynamic task mapping. This doesn't work and breaks DAG parsing. The following is the error it raises:
   
   ```
   Broken DAG: [/dags/foreach-mapper-fail.py] Traceback (most recent call last):
     File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/baseoperator.py", line 287, in partial
       partial_kwargs["resources"] = coerce_resources(partial_kwargs["resources"])
     File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/baseoperator.py", line 133, in coerce_resources
       return Resources(**resources)
   TypeError: type object argument after ** must be a mapping, not V1ResourceRequirements
   ```
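   For context, this is exactly the `TypeError` that Python's `**` unpacking raises for any non-mapping argument. A minimal, self-contained illustration, using a hypothetical stand-in for the `kubernetes.client` model (not the actual Airflow source):

```python
class V1ResourceRequirements:
    """Stand-in for kubernetes.client.V1ResourceRequirements (illustration only)."""

    def __init__(self, requests=None, limits=None):
        self.requests = requests
        self.limits = limits


def coerce_resources(resources):
    # Mirrors the shape of the failing call in baseoperator.py:
    # `**` unpacking demands a mapping, not an arbitrary object.
    return dict(**resources)


try:
    coerce_resources(V1ResourceRequirements(requests={"cpu": 1, "memory": "256M"}))
except TypeError as exc:
    # The same "** must be a mapping" TypeError as in the DAG-parse error
    print(exc)
```

   A plain dict passes through the same call unchanged, which is why the error only appears once a `V1ResourceRequirements` object reaches the coercion step.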
   
   ### What you think should happen instead
   
   I suspect it is because `BaseOperator` is explicitly given `None` in the `resources` [argument](https://github.com/apache/airflow/blob/8a34d25049a060a035d4db4a49cd4a0d0b07fb0b/airflow/providers/cncf/kubernetes/operators/kubernetes_pod.py#L217). The traceback shows that `partial` runs the `resources` kwarg through `coerce_resources`, which assumes a plain mapping rather than a `V1ResourceRequirements` object.
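   If that reading is right, the clash is that `BaseOperator.partial` reserves the `resources` name for the legacy `airflow.utils.operator_resources.Resources` style of plain dict, while `KubernetesPodOperator` callers pass a `V1ResourceRequirements` object under the same name. A simplified sketch of that assumption (hypothetical stand-in class, not the actual Airflow source):

```python
from dataclasses import dataclass


@dataclass
class Resources:
    """Stand-in for airflow.utils.operator_resources.Resources (illustration only)."""

    cpus: float = 1
    ram: int = 512


def coerce_resources(resources):
    # Mirrors the shape of airflow/models/baseoperator.py: the coercer
    # assumes `resources` is a plain mapping of legacy-style fields.
    return Resources(**resources) if resources is not None else None


# A legacy-style dict passes the coercion...
print(coerce_resources({"cpus": 2, "ram": 1024}))
# ...but a kubernetes.client.V1ResourceRequirements object would not,
# because `**` unpacking rejects non-mapping arguments.
```

   Under this reading, any operator argument named `resources` is coerced by the base class before the operator-specific meaning is ever considered.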
   
   ### How to reproduce
   
   Here is an example DAG to reproduce. Change the `namespace` in the `KubernetesPodOperator`:
   
   ```python
   from datetime import timedelta
   from airflow import DAG
    from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import KubernetesPodOperator
   from airflow.operators.bash import BashOperator
   from airflow.utils.dates import days_ago
   from kubernetes import client
   from airflow.models.param import Param
   
   default_args = {
       "owner": "Airflow",
       "depends_on_past": False,
       "start_date": days_ago(0),
       "email": ["[email protected]"],
       "email_on_failure": False,
       "email_on_retry": False,
       "retries": 0,
       "retry_delay": timedelta(minutes=5),
   }
   
   dag = DAG(
       "MapperK8sOperator-3",
       default_args=default_args,
       schedule_interval=timedelta(minutes=30),
       max_active_runs=1,
       user_defined_filters={"to_list": lambda x: list(x)},
       concurrency=10,
       params={
           "param1": Param(10, type="integer", minimum=0, maximum=20),
           "param2": "value2",
       },
   )
   
   # Generate 2 tasks
   tasks = ["task{}".format(i) for i in range(1, 3)]
   example_dag_complete_node = BashOperator(
       task_id="example_dag_complete",
       dag=dag,
        bash_command="echo {{task_instance.xcom_pull(task_ids='task1',key='pod_name') | to_list }}",
   )
   
   org_dags = []
   
   
   def make_cmds():
        return [
            ["mkdir -p /airflow/xcom/;echo '[1,2,3,4,6]' > /airflow/xcom/return.json;"],
            ["mkdir -p /airflow/xcom/;echo '[1,2,3,4,109]' > /airflow/xcom/return.json;"],
        ]
   
   
    for task in tasks:
       org_node = KubernetesPodOperator.partial(
           namespace="my-namespace",
           image="python",
           labels={"foo": "bar"},
           image_pull_policy="Always",
           name=task,
           task_id=task,
           is_delete_operator_pod=True,
        resources=client.V1ResourceRequirements(requests={
            "cpu": 1,
            "memory": "256M",
        }),
           do_xcom_push=True,
           get_logs=True,
           dag=dag,
       ).expand(cmds=make_cmds())
   
        org_node >> example_dag_complete_node
   
   ```
   
   ### Operating System
   
   Debian GNU/Linux
   
   ### Versions of Apache Airflow Providers
   
   ```
   apache-airflow-providers-amazon==3.4.0
   apache-airflow-providers-celery==2.1.4
   apache-airflow-providers-cncf-kubernetes==4.0.2
   apache-airflow-providers-docker==2.7.0
   apache-airflow-providers-elasticsearch==3.0.3
   apache-airflow-providers-ftp==2.1.2
   apache-airflow-providers-google==7.0.0
   apache-airflow-providers-grpc==2.0.4
   apache-airflow-providers-hashicorp==2.2.0
   apache-airflow-providers-http==2.1.2
   apache-airflow-providers-imap==2.2.3
   apache-airflow-providers-microsoft-azure==3.9.0
   apache-airflow-providers-mysql==2.2.3
   apache-airflow-providers-odbc==2.0.4
   apache-airflow-providers-postgres==4.1.0
   apache-airflow-providers-redis==2.0.4
   apache-airflow-providers-sendgrid==2.0.4
   apache-airflow-providers-sftp==2.6.0
   apache-airflow-providers-slack==4.2.3
   apache-airflow-providers-sqlite==2.1.3
   apache-airflow-providers-ssh==2.4.4
   ```
   
   ### Deployment
   
   Official Apache Airflow Helm Chart
   
   ### Deployment details
   
   The same example DAG works if the `resources` keyword argument is removed.
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

