raphaelauv opened a new issue, #27467:
URL: https://github.com/apache/airflow/issues/27467
### Apache Airflow Provider(s)
cncf-kubernetes
### Versions of Apache Airflow Providers
4.4.0
### Apache Airflow version
2.4.2
### Operating System
Ubuntu 22.04
### Deployment
Docker-Compose
### Deployment details
```json
"kubernetes_default": {
    "conn_type": "kubernetes",
    "extra": "{\"extra__kubernetes__in_cluster\": false, \"extra__kubernetes__kube_config_path\": \"/opt/airflow/include/.kube/config\", \"extra__kubernetes__namespace\": \"default\", \"extra__kubernetes__cluster_context\": \"kind-kind\", \"extra__kubernetes__disable_verify_ssl\": false, \"extra__kubernetes__disable_tcp_keepalive\": false}"
}
```
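To confirm the namespace really is available on the connection, the `extra` field can be decoded with plain Python (no Airflow needed) — this is only an illustration of the connection payload above, not provider code:

```python
import json

# The connection's "extra" field is a JSON-encoded string with the
# same keys as in the connection definition above.
extra = json.dumps({
    "extra__kubernetes__in_cluster": False,
    "extra__kubernetes__kube_config_path": "/opt/airflow/include/.kube/config",
    "extra__kubernetes__namespace": "default",
    "extra__kubernetes__cluster_context": "kind-kind",
    "extra__kubernetes__disable_verify_ssl": False,
    "extra__kubernetes__disable_tcp_keepalive": False,
})

fields = json.loads(extra)
print(fields["extra__kubernetes__namespace"])  # -> default
```

So the namespace is present in the connection extras; it is just not picked up by the operator.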
### What happened
```log
[2022-11-02, 09:06:49 UTC] {kubernetes_pod.py:587} INFO - Creating pod airflow-test-pod-8fd98700bb4c4850a83c0cb4ff030a17 with labels: {'dag_id': 'K8S_job_example', 'task_id': 'task-one', 'run_id': 'scheduled__2022-10-23T0000000000-6fd5e4439', 'kubernetes_pod_operator': 'True', 'try_number': '2'}
[2022-11-02, 09:06:49 UTC] {base.py:71} INFO - Using connection ID 'kubernetes_default' for task execution.
[2022-11-02, 09:06:49 UTC] {taskinstance.py:1851} ERROR - Task failed with exception
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.10/site-packages/airflow/providers/cncf/kubernetes/operators/kubernetes_pod.py", line 417, in execute
    self.pod = self.get_or_create_pod(  # must set `self.pod` for `on_kill`
  File "/home/airflow/.local/lib/python3.10/site-packages/airflow/providers/cncf/kubernetes/operators/kubernetes_pod.py", line 387, in get_or_create_pod
    pod = self.find_pod(self.namespace or pod_request_obj.metadata.namespace, context=context)
  File "/home/airflow/.local/lib/python3.10/site-packages/airflow/providers/cncf/kubernetes/operators/kubernetes_pod.py", line 369, in find_pod
    pod_list = self.client.list_namespaced_pod(
  File "/home/airflow/.local/lib/python3.10/site-packages/kubernetes/client/api/core_v1_api.py", line 15697, in list_namespaced_pod
    return self.list_namespaced_pod_with_http_info(namespace, **kwargs)  # noqa: E501
  File "/home/airflow/.local/lib/python3.10/site-packages/kubernetes/client/api/core_v1_api.py", line 15769, in list_namespaced_pod_with_http_info
    raise ApiValueError("Missing the required parameter `namespace` when calling `list_namespaced_pod`")  # noqa: E501
kubernetes.client.exceptions.ApiValueError: Missing the required parameter `namespace` when calling `list_namespaced_pod`
```
```python
from datetime import timedelta

from pendulum import today

from airflow import DAG
from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import KubernetesPodOperator

default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': today("UTC").add(days=-1),
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
}

dag = DAG('K8S_job_example',
          schedule_interval='0 0 * * *',
          default_args=default_args)

with dag:
    KubernetesPodOperator(
        # namespace="default",  # omitted on purpose: it is set on the connection
        kubernetes_conn_id="kubernetes_default",
        image="job_example:0.1",
        container_resources=None,
        name="airflow-test-pod",
        task_id="task-one",
        is_delete_operator_pod=True,
        get_logs=True,
    )
```
### What you think should happen instead
I should be able to omit the namespace on my KubernetesPodOperator when I have
already set a namespace in my Kubernetes connection.
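A minimal sketch of the fallback order I would expect (illustrative only — `resolve_namespace` is a hypothetical helper, not the provider's actual code): an explicit operator argument should win, then the connection's namespace, then the cluster default.

```python
from typing import Optional


def resolve_namespace(
    operator_namespace: Optional[str],
    conn_namespace: Optional[str],
    default: str = "default",
) -> str:
    """Hypothetical resolution order: operator arg > connection extra > default."""
    return operator_namespace or conn_namespace or default


# With namespace omitted on the KPO but set on the connection, the
# connection value should be used instead of raising ApiValueError:
assert resolve_namespace(None, "default") == "default"
# An explicit operator argument still takes precedence:
assert resolve_namespace("other", "default") == "other"
```

Today the operator effectively stops after the first step, passing `None` down to `list_namespaced_pod`, which is what produces the traceback above.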
### How to reproduce
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)