[jira] [Created] (AIRFLOW-3529) Create the ability to test the kubernetes executor on an existing cluster
Daniel Imberman created AIRFLOW-3529: Summary: Create the ability to test the kubernetes executor on an existing cluster Key: AIRFLOW-3529 URL: https://issues.apache.org/jira/browse/AIRFLOW-3529 Project: Apache Airflow Issue Type: Improvement Components: kubernetes Reporter: Daniel Imberman Assignee: Daniel Imberman Currently, all integration testing for the kubernetes executor takes place on minikube. This is both slower and less representative of real-world use cases. This PR will create a one-step script for running the tests on a real k8s cluster and add documentation for easier onboarding. -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Updated] (AIRFLOW-3505) Change name of 'dags_in_docker' field to 'dags
[ https://issues.apache.org/jira/browse/AIRFLOW-3505?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Daniel Imberman updated AIRFLOW-3505: - Summary: Change name of 'dags_in_docker' field to 'dags (was: Change name of 'image) > Change name of 'dags_in_docker' field to 'dags > -- > > Key: AIRFLOW-3505 > URL: https://issues.apache.org/jira/browse/AIRFLOW-3505 > Project: Apache Airflow > Issue Type: Improvement >Reporter: Daniel Imberman >Priority: Minor > -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Updated] (AIRFLOW-3505) Change name of 'dags_in_docker' field to 'dags_in_image'
[ https://issues.apache.org/jira/browse/AIRFLOW-3505?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Daniel Imberman updated AIRFLOW-3505: - Description: As Kubernetes moves away from Docker toward OCI-compliant runtimes, 'dags_in_image' is a more container-runtime-agnostic name. > Change name of 'dags_in_docker' field to 'dags_in_image' > > > Key: AIRFLOW-3505 > URL: https://issues.apache.org/jira/browse/AIRFLOW-3505 > Project: Apache Airflow > Issue Type: Improvement > Reporter: Daniel Imberman > Priority: Minor > > As Kubernetes moves away from Docker toward OCI-compliant runtimes, > 'dags_in_image' is a more container-runtime-agnostic name. -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Created] (AIRFLOW-3505) Change name of 'image
Daniel Imberman created AIRFLOW-3505: Summary: Change name of 'image Key: AIRFLOW-3505 URL: https://issues.apache.org/jira/browse/AIRFLOW-3505 Project: Apache Airflow Issue Type: Improvement Reporter: Daniel Imberman -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Updated] (AIRFLOW-3505) Change name of 'dags_in_docker' field to 'dags_in_image'
[ https://issues.apache.org/jira/browse/AIRFLOW-3505?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Daniel Imberman updated AIRFLOW-3505: - Summary: Change name of 'dags_in_docker' field to 'dags_in_image' (was: Change name of 'dags_in_docker' field to 'dags) > Change name of 'dags_in_docker' field to 'dags_in_image' > > > Key: AIRFLOW-3505 > URL: https://issues.apache.org/jira/browse/AIRFLOW-3505 > Project: Apache Airflow > Issue Type: Improvement >Reporter: Daniel Imberman >Priority: Minor > -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Created] (AIRFLOW-3484) The KubernetesExecutor is overly verbose and this can kill schedulers.
Daniel Imberman created AIRFLOW-3484: Summary: The KubernetesExecutor is overly verbose and this can kill schedulers. Key: AIRFLOW-3484 URL: https://issues.apache.org/jira/browse/AIRFLOW-3484 Project: Apache Airflow Issue Type: Bug Components: kubernetes Affects Versions: 1.10.1 Reporter: Daniel Imberman Assignee: Daniel Imberman There are two log lines in the k8s executor that can crash schedulers through their sheer verbosity. This PR will demote those lines to debug level so as not to disrupt normal workflows. -- This message was sent by Atlassian JIRA (v7.6.3#76005)
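The fix described above (demoting hot-path log lines from INFO to DEBUG) can be sketched as follows. The logger name and messages here are illustrative stand-ins, not the executor's actual lines:

```python
import logging

# Hypothetical stand-in for the executor's logger; name and messages are
# illustrative, not copied from the Airflow source.
logger = logging.getLogger("airflow.executors.kubernetes_executor")
logger.setLevel(logging.INFO)  # typical production level
logger.propagate = False

records = []

class ListHandler(logging.Handler):
    """Capture emitted messages so we can see what survives the level filter."""
    def emit(self, record):
        records.append(record.getMessage())

logger.addHandler(ListHandler())

# Before the fix: emitted on every watcher event, flooding the scheduler.
# logger.info("Event: %s had an event of type %s", "pod-name", "MODIFIED")

# After the fix: demoted to DEBUG, so it is suppressed at the default INFO level.
logger.debug("Event: %s had an event of type %s", "pod-name", "MODIFIED")
logger.info("Task submitted to kubernetes")  # normal-volume lines stay at INFO
```

Only the INFO-level line reaches the handler; the per-event chatter disappears without changing any behavior.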
[jira] [Commented] (AIRFLOW-2955) Kubernetes pod operator: Unable to set requests/limits on task pods
[ https://issues.apache.org/jira/browse/AIRFLOW-2955?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16596934#comment-16596934 ] Daniel Imberman commented on AIRFLOW-2955: -- [~jpds] So the problem here is that the operator is expecting a "Resources" class [https://github.com/apache/incubator-airflow/blob/master/airflow/contrib/kubernetes/pod.py#L19.] It might actually make more sense to have users just use a dict and then generate that class ourselves. Should be a pretty easy fix. Until then try creating that class and it should work. > Kubernetes pod operator: Unable to set requests/limits on task pods > --- > > Key: AIRFLOW-2955 > URL: https://issues.apache.org/jira/browse/AIRFLOW-2955 > Project: Apache Airflow > Issue Type: Bug >Reporter: Jon Davies >Priority: Major > > When I try and set a resource limit/request on a DAG task with the > KubernetesPodOperator as follows: > {code:java} > resources={"limit_cpu": 1, "request_cpu": 1}, > {code} > ...I get: > {code:java} > [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask > task Traceback (most recent call last): > [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask > task File "/usr/local/bin/airflow", line 32, in > [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask > task args.func(args) > [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask > task File "/usr/local/lib/python3.7/site-packages/airflow/utils/cli.py", > line 74, in wrapper > [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask > task return f(*args, **kwargs) > [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask > task File "/usr/local/lib/python3.7/site-packages/airflow/bin/cli.py", line > 498, in run > [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask > task _run(args, dag, ti) > [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask > task File 
"/usr/local/lib/python3.7/site-packages/airflow/bin/cli.py", line > 402, in _run > [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask > task pool=args.pool, > [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask > task File "/usr/local/lib/python3.7/site-packages/airflow/utils/db.py", > line 74, in wrapper > [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask > task return func(*args, **kwargs) > [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask > task File "/usr/local/lib/python3.7/site-packages/airflow/models.py", line > 1633, in _run_raw_task > [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask > task result = task_copy.execute(context=context) > [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask > task File > "/usr/local/lib/python3.7/site-packages/airflow/contrib/operators/kubernetes_pod_operator.py", > line 115, in execute > [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask > task get_logs=self.get_logs) > [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask > task File > "/usr/local/lib/python3.7/site-packages/airflow/contrib/kubernetes/pod_launcher.py", > line 71, in run_pod > [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask > task resp = self.run_pod_async(pod) > [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask > task File > "/usr/local/lib/python3.7/site-packages/airflow/contrib/kubernetes/pod_launcher.py", > line 52, in run_pod_async > [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask > task req = self.kube_req_factory.create(pod) > [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask > task File > "/usr/local/lib/python3.7/site-packages/airflow/contrib/kubernetes/kubernetes_request_factory/pod_request_factory.py", > line 56, in create > [2018-08-24 15:51:27,796] 
{base_task_runner.py:107} INFO - Job 2: Subtask > task self.extract_resources(pod, req) > [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask > task File > "/usr/local/lib/python3.7/site-packages/airflow/contrib/kubernetes/kubernetes_request_factory/kubernetes_request_factory.py", > line 160, in extract_resources > [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask > task if not pod.resources or pod.resources.is_empty_resource_request(): > [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask > task AttributeError: 'dict' object has no attribute > 'is_empty_resource_request' > {code} > ...setting >
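The workaround suggested in the comment above (construct the `Resources` class instead of passing a dict), and the proposed fix (accept a dict and build the class internally), can be sketched together. The class below is a simplified stand-in for `airflow/contrib/kubernetes/pod.py`'s `Resources`; field names follow the keys in the report, so treat the exact signature as an assumption:

```python
class Resources:
    """Simplified stand-in for airflow.contrib.kubernetes.pod.Resources."""
    def __init__(self, request_memory=None, request_cpu=None,
                 limit_memory=None, limit_cpu=None):
        self.request_memory = request_memory
        self.request_cpu = request_cpu
        self.limit_memory = limit_memory
        self.limit_cpu = limit_cpu

    def is_empty_resource_request(self):
        # The request factory calls this method; a plain dict has no such
        # attribute, which is exactly the AttributeError in the traceback above.
        return all(v is None for v in (self.request_memory, self.request_cpu,
                                       self.limit_memory, self.limit_cpu))

def resources_from_dict(d):
    """Sketch of the proposed fix: accept the dict users naturally pass
    (e.g. resources={"limit_cpu": 1, "request_cpu": 1}) and build the
    class on their behalf."""
    return Resources(**d) if isinstance(d, dict) else d

res = resources_from_dict({"limit_cpu": 1, "request_cpu": 1})
```

With the conversion in place, `res.is_empty_resource_request()` works whether the caller passed a dict or a `Resources` instance.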
[jira] [Commented] (AIRFLOW-2955) Kubernetes pod operator: Unable to set requests/limits on task pods
[ https://issues.apache.org/jira/browse/AIRFLOW-2955?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16596593#comment-16596593 ] Daniel Imberman commented on AIRFLOW-2955: -- Hi [~jpds]. Thank you for bringing this to my attention. Could you please post the task so I can get a slightly better idea where this bug is? > Kubernetes pod operator: Unable to set requests/limits on task pods > --- > > Key: AIRFLOW-2955 > URL: https://issues.apache.org/jira/browse/AIRFLOW-2955 > Project: Apache Airflow > Issue Type: Bug >Reporter: Jon Davies >Priority: Major > > When I try and set a resource limit/request on a DAG task with the > KubernetesPodOperator as follows: > {code:java} > resources={"limit_cpu": 1, "request_cpu": 1}, > {code} > ...I get: > {code:java} > [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask > task Traceback (most recent call last): > [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask > task File "/usr/local/bin/airflow", line 32, in > [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask > task args.func(args) > [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask > task File "/usr/local/lib/python3.7/site-packages/airflow/utils/cli.py", > line 74, in wrapper > [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask > task return f(*args, **kwargs) > [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask > task File "/usr/local/lib/python3.7/site-packages/airflow/bin/cli.py", line > 498, in run > [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask > task _run(args, dag, ti) > [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask > task File "/usr/local/lib/python3.7/site-packages/airflow/bin/cli.py", line > 402, in _run > [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask > task pool=args.pool, > [2018-08-24 15:51:27,795] 
{base_task_runner.py:107} INFO - Job 2: Subtask > task File "/usr/local/lib/python3.7/site-packages/airflow/utils/db.py", > line 74, in wrapper > [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask > task return func(*args, **kwargs) > [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask > task File "/usr/local/lib/python3.7/site-packages/airflow/models.py", line > 1633, in _run_raw_task > [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask > task result = task_copy.execute(context=context) > [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask > task File > "/usr/local/lib/python3.7/site-packages/airflow/contrib/operators/kubernetes_pod_operator.py", > line 115, in execute > [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask > task get_logs=self.get_logs) > [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask > task File > "/usr/local/lib/python3.7/site-packages/airflow/contrib/kubernetes/pod_launcher.py", > line 71, in run_pod > [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask > task resp = self.run_pod_async(pod) > [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask > task File > "/usr/local/lib/python3.7/site-packages/airflow/contrib/kubernetes/pod_launcher.py", > line 52, in run_pod_async > [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask > task req = self.kube_req_factory.create(pod) > [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask > task File > "/usr/local/lib/python3.7/site-packages/airflow/contrib/kubernetes/kubernetes_request_factory/pod_request_factory.py", > line 56, in create > [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask > task self.extract_resources(pod, req) > [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask > task File > 
"/usr/local/lib/python3.7/site-packages/airflow/contrib/kubernetes/kubernetes_request_factory/kubernetes_request_factory.py", > line 160, in extract_resources > [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask > task if not pod.resources or pod.resources.is_empty_resource_request(): > [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask > task AttributeError: 'dict' object has no attribute > 'is_empty_resource_request' > {code} > ...setting > https://github.com/apache/incubator-airflow/blob/fc10f7e0a04145a0b2f31f8d0990bbe900b4e8a2/airflow/example_dags/example_kubernetes_executor.py#L66 > works, however that only adjusts the metadata for the worker pod and not the > pod ultimately used for the task. -- This
[jira] [Created] (AIRFLOW-2952) Dockerized CI pipeline has silently broken integration testing for KubernetesExecutor
Daniel Imberman created AIRFLOW-2952: Summary: Dockerized CI pipeline has silently broken integration testing for KubernetesExecutor Key: AIRFLOW-2952 URL: https://issues.apache.org/jira/browse/AIRFLOW-2952 Project: Apache Airflow Issue Type: Bug Reporter: Daniel Imberman Assignee: Daniel Imberman [~gcuriel] [~bolke] [~Fokko] Looking at all recent builds, the new CI pipeline is silently reverting the kubernetes tests to the normal airflow tests. Before: https://travis-ci.org/apache/incubator-airflow/jobs/418914949#L1007 After: https://travis-ci.org/apache/incubator-airflow/jobs/419062412#L4970 This means that kubernetes builds will pass without actually testing on a kubernetes cluster. -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Created] (AIRFLOW-2894) Allow Users to "bake-in" DAGs in Airflow images
Daniel Imberman created AIRFLOW-2894: Summary: Allow users to "bake in" DAGs in Airflow images Key: AIRFLOW-2894 URL: https://issues.apache.org/jira/browse/AIRFLOW-2894 Project: Apache Airflow Issue Type: New Feature Reporter: Daniel Imberman Assignee: Daniel Imberman Multiple users have asked that we offer the ability to have DAGs baked into their airflow images (as opposed to using git-mode or a volume claim). This will save start-up time and allow for versioned DAGs via Docker. -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Created] (AIRFLOW-2714) Kubernetes Deployment Operator
Daniel Imberman created AIRFLOW-2714: Summary: Kubernetes Deployment Operator Key: AIRFLOW-2714 URL: https://issues.apache.org/jira/browse/AIRFLOW-2714 Project: Apache Airflow Issue Type: New Feature Reporter: Daniel Imberman Assignee: Daniel Imberman

*What?* Add an operator that monitors a k8s deployment, declaring the task complete on proper deployment/accessibility of the endpoint.

*Why?* Not all tasks are single pods. Sometimes you want one task that launches a service, and then a second task that smoke tests/stress tests/gives state to an application deployment. This would give airflow extra functionality as a CI/CD tool in the k8s ecosystem.

*Fix:* Create a modification (or extension) of the KubernetesPodOperator that can handle entire deployments (possibly using the k8s model API to ensure full flexibility for users). An example of creating a deployment using the k8s model architecture can be found here: https://github.com/kubernetes-client/python/blob/master/examples/deployment_examples.py

{code:java}
def create_deployment_object():
    # Configure the pod template container
    container = client.V1Container(
        name="nginx",
        image="nginx:1.7.9",
        ports=[client.V1ContainerPort(container_port=80)])
    # Create and configure a spec section
    template = client.V1PodTemplateSpec(
        metadata=client.V1ObjectMeta(labels={"app": "nginx"}),
        spec=client.V1PodSpec(containers=[container]))
    # Create the specification of the deployment
    spec = client.ExtensionsV1beta1DeploymentSpec(
        replicas=3,
        template=template)
    # Instantiate the deployment object
    deployment = client.ExtensionsV1beta1Deployment(
        api_version="extensions/v1beta1",
        kind="Deployment",
        metadata=client.V1ObjectMeta(name=DEPLOYMENT_NAME),
        spec=spec)
    return deployment
{code}

This would require more k8s knowledge from the user, but has the major benefit that we would not have to maintain new features as the k8s API updates (we would simply update the version). A user would have to supply a deployment object and possibly a "success criteria" (i.e. an endpoint to test). Conversely, we could make the API a bit easier by only requiring a spec and optional metadata, after which we would handle a lot of the boilerplate. -- This message was sent by Atlassian JIRA (v7.6.3#76005)
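One possible "success criteria" check for such an operator is polling the deployment status until the ready replica count matches the desired count. The sketch below mirrors the fields of the kubernetes client's `V1DeploymentStatus` but uses plain dicts so it runs standalone; the function name and polling approach are assumptions, not a committed API:

```python
def is_deployment_ready(desired_replicas, status):
    """Return True once every desired replica reports ready and updated.

    `status` carries `ready_replicas`/`updated_replicas` like the kubernetes
    client's V1DeploymentStatus (these are None until the first status report).
    """
    ready = status.get("ready_replicas") or 0
    updated = status.get("updated_replicas") or 0
    return ready >= desired_replicas and updated >= desired_replicas

# In a real operator, the status would come from polling something like
# AppsV1Api().read_namespaced_deployment(name, namespace).status in a loop,
# marking the task successful when this predicate first returns True.
rolling_out = {"ready_replicas": 1, "updated_replicas": 3}
done = {"ready_replicas": 3, "updated_replicas": 3}
```

Keeping the predicate separate from the polling loop makes it easy to unit-test without a cluster.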
[jira] [Updated] (AIRFLOW-2460) KubernetesPodOperator should be able to attach to volume mounts and configmaps
[ https://issues.apache.org/jira/browse/AIRFLOW-2460?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Daniel Imberman updated AIRFLOW-2460: - Issue Type: Bug (was: New Feature) > KubernetesPodOperator should be able to attach to volume mounts and configmaps > -- > > Key: AIRFLOW-2460 > URL: https://issues.apache.org/jira/browse/AIRFLOW-2460 > Project: Apache Airflow > Issue Type: Bug >Reporter: Daniel Imberman >Assignee: Daniel Imberman >Priority: Major > > In order to run tasks using the KubernetesPodOperator in a production > setting, users need to be able to access pre-existing data through > PersistentVolumes or ConfigMaps. -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Created] (AIRFLOW-2460) KubernetesPodOperator should be able to attach to volume mounts and configmaps
Daniel Imberman created AIRFLOW-2460: Summary: KubernetesPodOperator should be able to attach to volume mounts and configmaps Key: AIRFLOW-2460 URL: https://issues.apache.org/jira/browse/AIRFLOW-2460 Project: Apache Airflow Issue Type: New Feature Reporter: Daniel Imberman Assignee: Daniel Imberman In order to run tasks using the KubernetesPodOperator in a production setting, users need to be able to access pre-existing data through PersistentVolumes or ConfigMaps. -- This message was sent by Atlassian JIRA (v7.6.3#76005)
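At the k8s level, attaching pre-existing data comes down to pod-spec `volumes` entries (backed by a PersistentVolumeClaim or ConfigMap) plus container-level `volumeMounts`. The helpers below build those fragments as plain dicts; Airflow 1.10 ultimately wrapped this in `Volume`/`VolumeMount` classes, so the helper names here are illustrative:

```python
def volume_from_config_map(name, config_map_name):
    # Pod-spec "volumes" entry backed by a ConfigMap
    return {"name": name, "configMap": {"name": config_map_name}}

def volume_from_pvc(name, claim_name):
    # Pod-spec "volumes" entry backed by a PersistentVolumeClaim
    return {"name": name, "persistentVolumeClaim": {"claimName": claim_name}}

def volume_mount(name, mount_path, read_only=False):
    # Container-level "volumeMounts" entry referencing a volume by name
    return {"name": name, "mountPath": mount_path, "readOnly": read_only}

# Hypothetical pod spec mounting a DAGs PVC and an app ConfigMap into the task
# container (names and paths are examples, not Airflow defaults).
pod_spec = {
    "volumes": [
        volume_from_pvc("dags", "airflow-dags"),
        volume_from_config_map("app-config", "my-config"),
    ],
    "containers": [{
        "name": "base",
        "volumeMounts": [
            volume_mount("dags", "/usr/local/airflow/dags", read_only=True),
            volume_mount("app-config", "/etc/app"),
        ],
    }],
}
```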
[jira] [Created] (AIRFLOW-2450) Upgrade supported k8s versions in Airflow
Daniel Imberman created AIRFLOW-2450: Summary: Upgrade supported k8s versions in Airflow Key: AIRFLOW-2450 URL: https://issues.apache.org/jira/browse/AIRFLOW-2450 Project: Apache Airflow Issue Type: Bug Reporter: Daniel Imberman Assignee: Daniel Imberman To maintain support for the two most recent k8s releases, we should test against k8s 1.9 and 1.10 -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Created] (AIRFLOW-2424) Add dagrun status endpoint and increased k8s test coverage
Daniel Imberman created AIRFLOW-2424: Summary: Add dagrun status endpoint and increased k8s test coverage Key: AIRFLOW-2424 URL: https://issues.apache.org/jira/browse/AIRFLOW-2424 Project: Apache Airflow Issue Type: Bug Reporter: Daniel Imberman Assignee: Daniel Imberman In line with @Fokko's k8s testing, I think it adds value to have a "dagrun_status" endpoint so we can determine whether a dag run with the k8s executor finishes completely. I have also added a test for whether a dag will finish correctly even if the airflow pod is deleted. -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Created] (AIRFLOW-2335) Issue downloading oracle jdk8 is preventing travis builds from running
Daniel Imberman created AIRFLOW-2335: Summary: Issue downloading oracle jdk8 is preventing travis builds from running Key: AIRFLOW-2335 URL: https://issues.apache.org/jira/browse/AIRFLOW-2335 Project: Apache Airflow Issue Type: Bug Reporter: Daniel Imberman Assignee: Daniel Imberman Currently, all airflow builds are dying after ~1 minute due to an issue with how travis pulls jdk8 -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Work started] (AIRFLOW-2006) Add log catching capability to kubernetes operator
[ https://issues.apache.org/jira/browse/AIRFLOW-2006?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Work on AIRFLOW-2006 started by Daniel Imberman. > Add log catching capability to kubernetes operator > -- > > Key: AIRFLOW-2006 > URL: https://issues.apache.org/jira/browse/AIRFLOW-2006 > Project: Apache Airflow > Issue Type: Sub-task >Reporter: Daniel Imberman >Assignee: Daniel Imberman >Priority: Minor > > For the kubernetes operator, we can use the kubernetes logging API to gather > logs into the central airflow instance so they show up on the UI -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Created] (AIRFLOW-2006) Add log catching capability to kubernetes operator
Daniel Imberman created AIRFLOW-2006: Summary: Add log catching capability to kubernetes operator Key: AIRFLOW-2006 URL: https://issues.apache.org/jira/browse/AIRFLOW-2006 Project: Apache Airflow Issue Type: Sub-task Reporter: Daniel Imberman Assignee: Daniel Imberman For the kubernetes operator, we can use the kubernetes logging API to gather logs into the central airflow instance so they show up on the UI -- This message was sent by Atlassian JIRA (v7.6.3#76005)
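The log-gathering loop described above can be sketched as follows. The kubernetes client exposes pod logs via `CoreV1Api.read_namespaced_pod_log`; here the stream is faked with a list of byte chunks so the forwarding logic is self-contained, and the function name is illustrative:

```python
def forward_pod_logs(log_stream, sink):
    """Forward each decoded line from a pod log stream into `sink`.

    In real use, `log_stream` might be the streamed response of
    CoreV1Api().read_namespaced_pod_log(name, namespace, follow=True,
    _preload_content=False), and `sink` would be the task instance's
    logger, so the pod's output shows up in the airflow UI.
    """
    count = 0
    for chunk in log_stream:
        line = chunk.decode("utf-8") if isinstance(chunk, bytes) else chunk
        sink(line.rstrip("\n"))
        count += 1
    return count

# Fake stream standing in for a pod's stdout
collected = []
n = forward_pod_logs([b"step 1 done\n", b"step 2 done\n"], collected.append)
```

Separating the forwarding from the API call keeps the logic testable without a live cluster.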
[jira] [Work started] (AIRFLOW-1999) Service account integration for kubernetes executor
[ https://issues.apache.org/jira/browse/AIRFLOW-1999?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Work on AIRFLOW-1999 started by Daniel Imberman. > Service account integration for kubernetes executor > --- > > Key: AIRFLOW-1999 > URL: https://issues.apache.org/jira/browse/AIRFLOW-1999 > Project: Apache Airflow > Issue Type: Sub-task > Components: contrib >Reporter: Daniel Imberman >Assignee: Daniel Imberman >Priority: Minor > Fix For: Airflow 2.0 > > > Add service account integrations to kubernetes executors. This will use > custom initializers to allow users to decide service account permissions on a > per-task basis. -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Created] (AIRFLOW-1999) Service account integration for kubernetes executor
Daniel Imberman created AIRFLOW-1999: Summary: Service account integration for kubernetes executor Key: AIRFLOW-1999 URL: https://issues.apache.org/jira/browse/AIRFLOW-1999 Project: Apache Airflow Issue Type: Sub-task Reporter: Daniel Imberman Assignee: Feng Lu Priority: Minor Add service account integrations to kubernetes executors. This will use custom initializers to allow users to decide service account permissions on a per-task basis. -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Updated] (AIRFLOW-1899) Airflow Kubernetes Executor [basic]
[ https://issues.apache.org/jira/browse/AIRFLOW-1899?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Daniel Imberman updated AIRFLOW-1899: - Summary: Airflow Kubernetes Executor [basic] (was: Airflow Kubernetes Executor) > Airflow Kubernetes Executor [basic] > --- > > Key: AIRFLOW-1899 > URL: https://issues.apache.org/jira/browse/AIRFLOW-1899 > Project: Apache Airflow > Issue Type: Sub-task > Components: contrib > Reporter: Daniel Imberman > Assignee: Daniel Imberman > Fix For: Airflow 2.0 > > > The basic Kubernetes Executor PR should launch basic pods using the same pod > launcher as the kubernetes operator. This PR should not concern itself with a > lot of the extra features, which can be added in future PRs. A successful > PR for this issue should be able to launch a pod, watch it using the watcher > API, and track failures/successes. It should also include basic testing for the > executor using [~grantnicholas]'s testing library. cc: [~benjigoldberg] > [~bolke] -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Updated] (AIRFLOW-1899) Airflow Kubernetes Executor
[ https://issues.apache.org/jira/browse/AIRFLOW-1899?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Daniel Imberman updated AIRFLOW-1899: - Description: The basic Kubernetes Executor PR should launch basic pods using the same pod launcher as the kubernetes operator. This PR should not concern itself with a lot of the extra features, which can be added in future PRs. A successful PR for this issue should be able to launch a pod, watch it using the watcher API, and track failures/successes. It should also include basic testing for the executor using [~grantnicholas]'s testing library. cc: [~benjigoldberg] [~bolke] > Airflow Kubernetes Executor > --- > > Key: AIRFLOW-1899 > URL: https://issues.apache.org/jira/browse/AIRFLOW-1899 > Project: Apache Airflow > Issue Type: Sub-task > Components: contrib > Reporter: Daniel Imberman > Assignee: Daniel Imberman > Fix For: Airflow 2.0 > > > The basic Kubernetes Executor PR should launch basic pods using the same pod > launcher as the kubernetes operator. This PR should not concern itself with a > lot of the extra features, which can be added in future PRs. A successful > PR for this issue should be able to launch a pod, watch it using the watcher > API, and track failures/successes. It should also include basic testing for the > executor using [~grantnicholas]'s testing library. cc: [~benjigoldberg] > [~bolke] -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Created] (AIRFLOW-1961) Create kubernetes watcher for Kubernetes executor to track pod failures
Daniel Imberman created AIRFLOW-1961: Summary: Create kubernetes watcher for Kubernetes executor to track pod failures Key: AIRFLOW-1961 URL: https://issues.apache.org/jira/browse/AIRFLOW-1961 Project: Apache Airflow Issue Type: Sub-task Reporter: Daniel Imberman Assignee: Daniel Imberman -- This message was sent by Atlassian JIRA (v6.4.14#64029)
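The watcher's core job is mapping pod watch events to task outcomes. The kubernetes client's `watch.Watch().stream()` yields events shaped roughly like `{"type": ..., "object": pod}`; below, events are plain dicts so the dispatch logic runs standalone. The state names follow k8s pod phases, while the airflow-side handling is an assumption:

```python
def process_pod_event(event):
    """Map a k8s pod watch event to a task result, or None if not terminal.

    In the real executor, events would come from something like
    kubernetes.watch.Watch().stream(v1.list_namespaced_pod, namespace),
    and a terminal result would be reported back to the scheduler.
    """
    phase = event["object"]["status"]["phase"]
    if phase == "Succeeded":
        return "success"
    if phase == "Failed":
        return "failed"
    return None  # Pending/Running/Unknown: keep watching

succeeded = {"type": "MODIFIED", "object": {"status": {"phase": "Succeeded"}}}
failed = {"type": "MODIFIED", "object": {"status": {"phase": "Failed"}}}
running = {"type": "ADDED", "object": {"status": {"phase": "Running"}}}
```

Isolating the phase-to-result mapping is what makes failures trackable: a deleted or evicted pod surfaces as a `Failed` phase rather than silently disappearing.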
[jira] [Created] (AIRFLOW-1960) Add kubernetes secrets to airflow kubernetes operator/executor
Daniel Imberman created AIRFLOW-1960: Summary: Add kubernetes secrets to airflow kubernetes operator/executor Key: AIRFLOW-1960 URL: https://issues.apache.org/jira/browse/AIRFLOW-1960 Project: Apache Airflow Issue Type: Sub-task Reporter: Daniel Imberman Assignee: Benjamin Goldberg -- This message was sent by Atlassian JIRA (v6.4.14#64029)
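Exposing a k8s Secret to a task pod comes down to either an env var with a `secretKeyRef` or a secret-backed volume. The helper below emits the env-var form as a plain pod-spec fragment; Airflow 1.10 later wrapped this in a `Secret` class, so the helper name and the secret/key names here are illustrative:

```python
def secret_env_var(env_name, secret_name, secret_key):
    # Pod-spec "env" entry whose value is pulled from a k8s Secret at runtime,
    # so credentials never appear in the DAG file or the pod spec itself.
    return {
        "name": env_name,
        "valueFrom": {
            "secretKeyRef": {"name": secret_name, "key": secret_key},
        },
    }

# Hypothetical example: expose a database connection string to the worker pod
env = [secret_env_var("SQL_CONN", "airflow-secrets", "sql_alchemy_conn")]
```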
[jira] [Created] (AIRFLOW-1959) Create minikube testing library for testing kubernetes executor/operator
Daniel Imberman created AIRFLOW-1959: Summary: Create minikube testing library for testing kubernetes executor/operator Key: AIRFLOW-1959 URL: https://issues.apache.org/jira/browse/AIRFLOW-1959 Project: Apache Airflow Issue Type: Sub-task Reporter: Daniel Imberman Assignee: Grant Nicholas -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Updated] (AIRFLOW-1314) Airflow kubernetes integration
[ https://issues.apache.org/jira/browse/AIRFLOW-1314?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Daniel Imberman updated AIRFLOW-1314: - Priority: Major (was: Minor) > Airflow kubernetes integration > -- > > Key: AIRFLOW-1314 > URL: https://issues.apache.org/jira/browse/AIRFLOW-1314 > Project: Apache Airflow > Issue Type: Improvement > Components: contrib >Affects Versions: Airflow 2.0 >Reporter: Daniel Imberman >Assignee: Daniel Imberman > Labels: features > Fix For: Airflow 2.0 > > > Kubernetes is a container-based cluster management system designed by google > for easy application deployment. Companies such as Airbnb, Bloomberg, > Palantir, and Google use kubernetes for a variety of large-scale solutions > including data science, ETL, and app deployment. Integrating airflow into > Kubernetes would increase viable use cases for airflow, promote airflow as a > de facto workflow scheduler for Kubernetes, and create possibilities for > improved security and robustness within airflow. -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Created] (AIRFLOW-1899) Airflow Kubernetes Executor
Daniel Imberman created AIRFLOW-1899: Summary: Airflow Kubernetes Executor Key: AIRFLOW-1899 URL: https://issues.apache.org/jira/browse/AIRFLOW-1899 Project: Apache Airflow Issue Type: Sub-task Reporter: Daniel Imberman Assignee: Daniel Imberman -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Updated] (AIRFLOW-1517) Create Kubernetes Operator Only PR
[ https://issues.apache.org/jira/browse/AIRFLOW-1517?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Daniel Imberman updated AIRFLOW-1517: - Priority: Minor (was: Major) > Create Kubernetes Operator Only PR > -- > > Key: AIRFLOW-1517 > URL: https://issues.apache.org/jira/browse/AIRFLOW-1517 > Project: Apache Airflow > Issue Type: Sub-task > Components: contrib >Reporter: Daniel Imberman >Assignee: Daniel Imberman >Priority: Minor > Fix For: Airflow 2.0 > > > To reduce the size of the PR and create early momentum for the > airflow-kubernetes system, we will start out with a smaller PR that only > creates a number of kubernetes-based operators. -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Created] (AIRFLOW-1517) Create Kubernetes Operator Only PR
Daniel Imberman created AIRFLOW-1517: Summary: Create Kubernetes Operator Only PR Key: AIRFLOW-1517 URL: https://issues.apache.org/jira/browse/AIRFLOW-1517 Project: Apache Airflow Issue Type: Sub-task Reporter: Daniel Imberman Assignee: Daniel Imberman To reduce the size of the PR and create early momentum for the airflow-kubernetes system, we will start out with a smaller PR that only creates a number of kubernetes-based operators. -- This message was sent by Atlassian JIRA (v6.4.14#64029)
[jira] [Created] (AIRFLOW-1314) Airflow kubernetes integration
Daniel Imberman created AIRFLOW-1314: Summary: Airflow kubernetes integration Key: AIRFLOW-1314 URL: https://issues.apache.org/jira/browse/AIRFLOW-1314 Project: Apache Airflow Issue Type: Improvement Components: contrib Affects Versions: Airflow 2.0 Reporter: Daniel Imberman Assignee: Daniel Imberman Priority: Minor Fix For: Airflow 2.0 Kubernetes is a container-based cluster management system designed by Google for easy application deployment. Companies such as Airbnb, Bloomberg, Palantir, and Google use Kubernetes for a variety of large-scale solutions including data science, ETL, and app deployment. Integrating airflow into Kubernetes would increase viable use cases for airflow, promote airflow as a de facto workflow scheduler for Kubernetes, and create possibilities for improved security and robustness within airflow. -- This message was sent by Atlassian JIRA (v6.4.14#64029)